Running the CIFAR-10 Example in the Caffe Framework

2017/5/23 posted in Caffe framework learning, basics

1. First, run the data/cifar10/get_cifar10.sh script from the Caffe root directory.

#!/usr/bin/env sh
# This script downloads the CIFAR10 (binary version) data and unzips it.

DIR="$( cd "$(dirname "$0")" ; pwd -P )"
cd "$DIR"

echo "Downloading..."

wget --no-check-certificate http://www.cs.toronto.edu/~kriz/cifar-10-binary.tar.gz

echo "Unzipping..."

tar -xf cifar-10-binary.tar.gz && rm -f cifar-10-binary.tar.gz
mv cifar-10-batches-bin/* . && rm -rf cifar-10-batches-bin

# Creation is split out because leveldb sometimes causes segfault
# and needs to be re-created.

echo "Done."

This downloads the CIFAR-10 binary archive and unpacks the batch files into data/cifar10/.
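
If the script finishes cleanly, the standard CIFAR-10 binary files should now sit in data/cifar10/; a quick sanity check (file names per the official binary release):

ls data/cifar10
# expected, among others: batches.meta.txt, data_batch_1.bin ... data_batch_5.bin,
# and test_batch.bin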

Next, run examples/cifar10/create_cifar10.sh to convert the raw batches into LMDB. On my macOS setup this first fails: convert_cifar_data.bin cannot load the shared libraries it expects from Anaconda's lib directory. Add that directory to the binary's rpath:

install_name_tool -add_rpath '/Users/liangzhonghao/anaconda2/lib'  /usr/local/Cellar/caffe/build/examples/cifar10/convert_cifar_data.bin

Running ./examples/cifar10/create_cifar10.sh again fails in the same way, this time in compute_image_mean. Apply the same fix to that binary:

install_name_tool -add_rpath '/Users/liangzhonghao/anaconda2/lib'  /usr/local/Cellar/caffe/build/tools/compute_image_mean
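
Both failures are the same class of problem, judging by the identical fix: the binaries reference dylibs that live under Anaconda, and install_name_tool -add_rpath embeds an extra runtime search path into the Mach-O binary so that dyld can resolve them. To inspect a binary's library references and rpath entries before or after patching, otool prints its load commands; a small sketch against the same binary as above:

# which dylibs does the binary link against?
otool -L /usr/local/Cellar/caffe/build/tools/compute_image_mean

# confirm the new rpath entry is present after patching
otool -l /usr/local/Cellar/caffe/build/tools/compute_image_mean | grep -A 2 LC_RPATH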

Then run the conversion script once more:

./examples/cifar10/create_cifar10.sh

This time it succeeds.
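
To double-check, the conversion should have produced the LMDB databases and the image mean that the training log below reads from:

ls examples/cifar10
# should now include cifar10_train_lmdb/, cifar10_test_lmdb/ and mean.binaryproto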

Training and Testing the "Quick" Model

Since the example already ships the predefined network protobuf (cifar10_quick_train_test.prototxt) and the solver protobufs, we can run train_quick.sh directly.

Its contents:

#!/usr/bin/env sh
set -e

TOOLS=./build/tools

$TOOLS/caffe train \
  --solver=examples/cifar10/cifar10_quick_solver.prototxt $@

# reduce learning rate by factor of 10 after 8 epochs
$TOOLS/caffe train \
  --solver=examples/cifar10/cifar10_quick_solver_lr1.prototxt \
  --snapshot=examples/cifar10/cifar10_quick_iter_4000.solverstate $@
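
The script trains in two stages: first 4,000 iterations with cifar10_quick_solver.prototxt (at batch size 100 that is 400,000 images, i.e. the 8 epochs over the 50,000 training images mentioned in the comment), then it resumes from the saved solverstate and continues to iteration 5,000 with the learning rate lowered from 0.001 to 0.0001. The key fields of the first solver file, reconstructed here from the parameter dump the solver prints below, are roughly:

# cifar10_quick_solver.prototxt -- key fields as echoed in the log
net: "examples/cifar10/cifar10_quick_train_test.prototxt"
test_iter: 100        # 100 batches x 100 images = the full 10,000-image test set
test_interval: 500    # evaluate every 500 training iterations
base_lr: 0.001
lr_policy: "fixed"
momentum: 0.9
weight_decay: 0.004
display: 100
max_iter: 4000
snapshot: 4000
snapshot_prefix: "examples/cifar10/cifar10_quick"
solver_mode: CPU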

Execute the following command:

➜ caffe git:(master) ✗ ./examples/cifar10/train_quick.sh

The output follows:

I0523 15:43:36.608793 2712679360 caffe.cpp:211] Use CPU.
I0523 15:43:36.609737 2712679360 solver.cpp:44] Initializing solver from parameters:
test_iter: 100
test_interval: 500
base_lr: 0.001
display: 100
max_iter: 4000
lr_policy: "fixed"
momentum: 0.9
weight_decay: 0.004
snapshot: 4000
snapshot_prefix: "examples/cifar10/cifar10_quick"
solver_mode: CPU
net: "examples/cifar10/cifar10_quick_train_test.prototxt"
train_state {
  level: 0
  stage: ""
}
I0523 15:43:36.610075 2712679360 solver.cpp:87] Creating training net from net file: examples/cifar10/cifar10_quick_train_test.prototxt
I0523 15:43:36.610931 2712679360 net.cpp:294] The NetState phase (0) differed from the phase (1) specified by a rule in layer cifar
I0523 15:43:36.610961 2712679360 net.cpp:294] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy
I0523 15:43:36.610966 2712679360 net.cpp:51] Initializing net from parameters:
name: "CIFAR10_quick"
state {
  phase: TRAIN
  level: 0
  stage: ""
}
layer {
  name: "cifar"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    mean_file: "examples/cifar10/mean.binaryproto"
  }
  data_param {
    source: "examples/cifar10/cifar10_train_lmdb"
    batch_size: 100
    backend: LMDB
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.0001
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "pool1"
  top: "pool1"
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 64
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "pool3"
  type: "Pooling"
  bottom: "conv3"
  top: "pool3"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "pool3"
  top: "ip1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 64
    weight_filler {
      type: "gaussian"
      std: 0.1
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "ip2"
  type: "InnerProduct"
  bottom: "ip1"
  top: "ip2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 10
    weight_filler {
      type: "gaussian"
      std: 0.1
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "ip2"
  bottom: "label"
  top: "loss"
}
I0523 15:43:36.611205 2712679360 layer_factory.hpp:77] Creating layer cifar
I0523 15:43:36.611467 2712679360 db_lmdb.cpp:35] Opened lmdb examples/cifar10/cifar10_train_lmdb
I0523 15:43:36.611524 2712679360 net.cpp:84] Creating Layer cifar
I0523 15:43:36.611531 2712679360 net.cpp:380] cifar -> data
I0523 15:43:36.611549 2712679360 net.cpp:380] cifar -> label
I0523 15:43:36.611565 2712679360 data_transformer.cpp:25] Loading mean file from: examples/cifar10/mean.binaryproto
I0523 15:43:36.611686 2712679360 data_layer.cpp:45] output data size: 100,3,32,32
I0523 15:43:36.617992 2712679360 net.cpp:122] Setting up cifar
I0523 15:43:36.618022 2712679360 net.cpp:129] Top shape: 100 3 32 32 (307200)
I0523 15:43:36.618028 2712679360 net.cpp:129] Top shape: 100 (100)
I0523 15:43:36.618032 2712679360 net.cpp:137] Memory required for data: 1229200
I0523 15:43:36.618041 2712679360 layer_factory.hpp:77] Creating layer conv1
I0523 15:43:36.618052 2712679360 net.cpp:84] Creating Layer conv1
I0523 15:43:36.618057 2712679360 net.cpp:406] conv1 <- data
I0523 15:43:36.618063 2712679360 net.cpp:380] conv1 -> conv1
I0523 15:43:36.618175 2712679360 net.cpp:122] Setting up conv1
I0523 15:43:36.618180 2712679360 net.cpp:129] Top shape: 100 32 32 32 (3276800)
I0523 15:43:36.618185 2712679360 net.cpp:137] Memory required for data: 14336400
I0523 15:43:36.618192 2712679360 layer_factory.hpp:77] Creating layer pool1
I0523 15:43:36.618199 2712679360 net.cpp:84] Creating Layer pool1
I0523 15:43:36.618202 2712679360 net.cpp:406] pool1 <- conv1
I0523 15:43:36.618206 2712679360 net.cpp:380] pool1 -> pool1
I0523 15:43:36.618216 2712679360 net.cpp:122] Setting up pool1
I0523 15:43:36.618219 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 15:43:36.618224 2712679360 net.cpp:137] Memory required for data: 17613200
I0523 15:43:36.618228 2712679360 layer_factory.hpp:77] Creating layer relu1
I0523 15:43:36.618234 2712679360 net.cpp:84] Creating Layer relu1
I0523 15:43:36.618238 2712679360 net.cpp:406] relu1 <- pool1
I0523 15:43:36.618242 2712679360 net.cpp:367] relu1 -> pool1 (in-place)
I0523 15:43:36.618247 2712679360 net.cpp:122] Setting up relu1
I0523 15:43:36.618250 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 15:43:36.618255 2712679360 net.cpp:137] Memory required for data: 20890000
I0523 15:43:36.618263 2712679360 layer_factory.hpp:77] Creating layer conv2
I0523 15:43:36.618273 2712679360 net.cpp:84] Creating Layer conv2
I0523 15:43:36.618276 2712679360 net.cpp:406] conv2 <- pool1
I0523 15:43:36.618281 2712679360 net.cpp:380] conv2 -> conv2
I0523 15:43:36.618585 2712679360 net.cpp:122] Setting up conv2
I0523 15:43:36.618592 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 15:43:36.618597 2712679360 net.cpp:137] Memory required for data: 24166800
I0523 15:43:36.618602 2712679360 layer_factory.hpp:77] Creating layer relu2
I0523 15:43:36.618607 2712679360 net.cpp:84] Creating Layer relu2
I0523 15:43:36.618609 2712679360 net.cpp:406] relu2 <- conv2
I0523 15:43:36.618614 2712679360 net.cpp:367] relu2 -> conv2 (in-place)
I0523 15:43:36.618619 2712679360 net.cpp:122] Setting up relu2
I0523 15:43:36.618623 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 15:43:36.618628 2712679360 net.cpp:137] Memory required for data: 27443600
I0523 15:43:36.618630 2712679360 layer_factory.hpp:77] Creating layer pool2
I0523 15:43:36.618634 2712679360 net.cpp:84] Creating Layer pool2
I0523 15:43:36.618638 2712679360 net.cpp:406] pool2 <- conv2
I0523 15:43:36.618643 2712679360 net.cpp:380] pool2 -> pool2
I0523 15:43:36.618647 2712679360 net.cpp:122] Setting up pool2
I0523 15:43:36.618654 2712679360 net.cpp:129] Top shape: 100 32 8 8 (204800)
I0523 15:43:36.618662 2712679360 net.cpp:137] Memory required for data: 28262800
I0523 15:43:36.618669 2712679360 layer_factory.hpp:77] Creating layer conv3
I0523 15:43:36.618680 2712679360 net.cpp:84] Creating Layer conv3
I0523 15:43:36.618685 2712679360 net.cpp:406] conv3 <- pool2
I0523 15:43:36.618695 2712679360 net.cpp:380] conv3 -> conv3
I0523 15:43:36.619361 2712679360 net.cpp:122] Setting up conv3
I0523 15:43:36.619372 2712679360 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0523 15:43:36.619379 2712679360 net.cpp:137] Memory required for data: 29901200
I0523 15:43:36.619385 2712679360 layer_factory.hpp:77] Creating layer relu3
I0523 15:43:36.619390 2712679360 net.cpp:84] Creating Layer relu3
I0523 15:43:36.619393 2712679360 net.cpp:406] relu3 <- conv3
I0523 15:43:36.619398 2712679360 net.cpp:367] relu3 -> conv3 (in-place)
I0523 15:43:36.619403 2712679360 net.cpp:122] Setting up relu3
I0523 15:43:36.619447 2712679360 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0523 15:43:36.619459 2712679360 net.cpp:137] Memory required for data: 31539600
I0523 15:43:36.619467 2712679360 layer_factory.hpp:77] Creating layer pool3
I0523 15:43:36.619477 2712679360 net.cpp:84] Creating Layer pool3
I0523 15:43:36.619484 2712679360 net.cpp:406] pool3 <- conv3
I0523 15:43:36.619493 2712679360 net.cpp:380] pool3 -> pool3
I0523 15:43:36.619505 2712679360 net.cpp:122] Setting up pool3
I0523 15:43:36.619513 2712679360 net.cpp:129] Top shape: 100 64 4 4 (102400)
I0523 15:43:36.619523 2712679360 net.cpp:137] Memory required for data: 31949200
I0523 15:43:36.619529 2712679360 layer_factory.hpp:77] Creating layer ip1
I0523 15:43:36.619539 2712679360 net.cpp:84] Creating Layer ip1
I0523 15:43:36.619546 2712679360 net.cpp:406] ip1 <- pool3
I0523 15:43:36.619555 2712679360 net.cpp:380] ip1 -> ip1
I0523 15:43:36.620586 2712679360 net.cpp:122] Setting up ip1
I0523 15:43:36.620602 2712679360 net.cpp:129] Top shape: 100 64 (6400)
I0523 15:43:36.620607 2712679360 net.cpp:137] Memory required for data: 31974800
I0523 15:43:36.620613 2712679360 layer_factory.hpp:77] Creating layer ip2
I0523 15:43:36.620620 2712679360 net.cpp:84] Creating Layer ip2
I0523 15:43:36.620625 2712679360 net.cpp:406] ip2 <- ip1
I0523 15:43:36.620630 2712679360 net.cpp:380] ip2 -> ip2
I0523 15:43:36.620649 2712679360 net.cpp:122] Setting up ip2
I0523 15:43:36.620656 2712679360 net.cpp:129] Top shape: 100 10 (1000)
I0523 15:43:36.620662 2712679360 net.cpp:137] Memory required for data: 31978800
I0523 15:43:36.620673 2712679360 layer_factory.hpp:77] Creating layer loss
I0523 15:43:36.620682 2712679360 net.cpp:84] Creating Layer loss
I0523 15:43:36.620689 2712679360 net.cpp:406] loss <- ip2
I0523 15:43:36.620697 2712679360 net.cpp:406] loss <- label
I0523 15:43:36.620703 2712679360 net.cpp:380] loss -> loss
I0523 15:43:36.620730 2712679360 layer_factory.hpp:77] Creating layer loss
I0523 15:43:36.620749 2712679360 net.cpp:122] Setting up loss
I0523 15:43:36.620756 2712679360 net.cpp:129] Top shape: (1)
I0523 15:43:36.620764 2712679360 net.cpp:132]     with loss weight 1
I0523 15:43:36.620787 2712679360 net.cpp:137] Memory required for data: 31978804
I0523 15:43:36.620795 2712679360 net.cpp:198] loss needs backward computation.
I0523 15:43:36.620800 2712679360 net.cpp:198] ip2 needs backward computation.
I0523 15:43:36.620807 2712679360 net.cpp:198] ip1 needs backward computation.
I0523 15:43:36.620813 2712679360 net.cpp:198] pool3 needs backward computation.
I0523 15:43:36.620820 2712679360 net.cpp:198] relu3 needs backward computation.
I0523 15:43:36.620832 2712679360 net.cpp:198] conv3 needs backward computation.
I0523 15:43:36.620851 2712679360 net.cpp:198] pool2 needs backward computation.
I0523 15:43:36.620859 2712679360 net.cpp:198] relu2 needs backward computation.
I0523 15:43:36.620867 2712679360 net.cpp:198] conv2 needs backward computation.
I0523 15:43:36.620875 2712679360 net.cpp:198] relu1 needs backward computation.
I0523 15:43:36.620882 2712679360 net.cpp:198] pool1 needs backward computation.
I0523 15:43:36.620889 2712679360 net.cpp:198] conv1 needs backward computation.
I0523 15:43:36.620896 2712679360 net.cpp:200] cifar does not need backward computation.
I0523 15:43:36.620904 2712679360 net.cpp:242] This network produces output loss
I0523 15:43:36.620916 2712679360 net.cpp:255] Network initialization done.
I0523 15:43:36.621170 2712679360 solver.cpp:172] Creating test net (#0) specified by net file: examples/cifar10/cifar10_quick_train_test.prototxt
I0523 15:43:36.621199 2712679360 net.cpp:294] The NetState phase (1) differed from the phase (0) specified by a rule in layer cifar
I0523 15:43:36.621210 2712679360 net.cpp:51] Initializing net from parameters:
name: "CIFAR10_quick"
state {
  phase: TEST
}
layer {
  name: "cifar"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TEST
  }
  transform_param {
    mean_file: "examples/cifar10/mean.binaryproto"
  }
  data_param {
    source: "examples/cifar10/cifar10_test_lmdb"
    batch_size: 100
    backend: LMDB
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.0001
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "pool1"
  top: "pool1"
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 64
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "pool3"
  type: "Pooling"
  bottom: "conv3"
  top: "pool3"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "pool3"
  top: "ip1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 64
    weight_filler {
      type: "gaussian"
      std: 0.1
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "ip2"
  type: "InnerProduct"
  bottom: "ip1"
  top: "ip2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 10
    weight_filler {
      type: "gaussian"
      std: 0.1
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "ip2"
  bottom: "label"
  top: "accuracy"
  include {
    phase: TEST
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "ip2"
  bottom: "label"
  top: "loss"
}
I0523 15:43:36.621821 2712679360 layer_factory.hpp:77] Creating layer cifar
I0523 15:43:36.621913 2712679360 db_lmdb.cpp:35] Opened lmdb examples/cifar10/cifar10_test_lmdb
I0523 15:43:36.621933 2712679360 net.cpp:84] Creating Layer cifar
I0523 15:43:36.621943 2712679360 net.cpp:380] cifar -> data
I0523 15:43:36.621950 2712679360 net.cpp:380] cifar -> label
I0523 15:43:36.621958 2712679360 data_transformer.cpp:25] Loading mean file from: examples/cifar10/mean.binaryproto
I0523 15:43:36.622017 2712679360 data_layer.cpp:45] output data size: 100,3,32,32
I0523 15:43:36.624790 2712679360 net.cpp:122] Setting up cifar
I0523 15:43:36.624822 2712679360 net.cpp:129] Top shape: 100 3 32 32 (307200)
I0523 15:43:36.624830 2712679360 net.cpp:129] Top shape: 100 (100)
I0523 15:43:36.624835 2712679360 net.cpp:137] Memory required for data: 1229200
I0523 15:43:36.624840 2712679360 layer_factory.hpp:77] Creating layer label_cifar_1_split
I0523 15:43:36.624851 2712679360 net.cpp:84] Creating Layer label_cifar_1_split
I0523 15:43:36.624856 2712679360 net.cpp:406] label_cifar_1_split <- label
I0523 15:43:36.624862 2712679360 net.cpp:380] label_cifar_1_split -> label_cifar_1_split_0
I0523 15:43:36.624869 2712679360 net.cpp:380] label_cifar_1_split -> label_cifar_1_split_1
I0523 15:43:36.624876 2712679360 net.cpp:122] Setting up label_cifar_1_split
I0523 15:43:36.624878 2712679360 net.cpp:129] Top shape: 100 (100)
I0523 15:43:36.624882 2712679360 net.cpp:129] Top shape: 100 (100)
I0523 15:43:36.624886 2712679360 net.cpp:137] Memory required for data: 1230000
I0523 15:43:36.624917 2712679360 layer_factory.hpp:77] Creating layer conv1
I0523 15:43:36.624927 2712679360 net.cpp:84] Creating Layer conv1
I0523 15:43:36.624930 2712679360 net.cpp:406] conv1 <- data
I0523 15:43:36.624935 2712679360 net.cpp:380] conv1 -> conv1
I0523 15:43:36.624987 2712679360 net.cpp:122] Setting up conv1
I0523 15:43:36.624991 2712679360 net.cpp:129] Top shape: 100 32 32 32 (3276800)
I0523 15:43:36.624996 2712679360 net.cpp:137] Memory required for data: 14337200
I0523 15:43:36.625002 2712679360 layer_factory.hpp:77] Creating layer pool1
I0523 15:43:36.625008 2712679360 net.cpp:84] Creating Layer pool1
I0523 15:43:36.625011 2712679360 net.cpp:406] pool1 <- conv1
I0523 15:43:36.625015 2712679360 net.cpp:380] pool1 -> pool1
I0523 15:43:36.625022 2712679360 net.cpp:122] Setting up pool1
I0523 15:43:36.625026 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 15:43:36.625031 2712679360 net.cpp:137] Memory required for data: 17614000
I0523 15:43:36.625036 2712679360 layer_factory.hpp:77] Creating layer relu1
I0523 15:43:36.625041 2712679360 net.cpp:84] Creating Layer relu1
I0523 15:43:36.625043 2712679360 net.cpp:406] relu1 <- pool1
I0523 15:43:36.625048 2712679360 net.cpp:367] relu1 -> pool1 (in-place)
I0523 15:43:36.625053 2712679360 net.cpp:122] Setting up relu1
I0523 15:43:36.625056 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 15:43:36.625061 2712679360 net.cpp:137] Memory required for data: 20890800
I0523 15:43:36.625064 2712679360 layer_factory.hpp:77] Creating layer conv2
I0523 15:43:36.625071 2712679360 net.cpp:84] Creating Layer conv2
I0523 15:43:36.625074 2712679360 net.cpp:406] conv2 <- pool1
I0523 15:43:36.625084 2712679360 net.cpp:380] conv2 -> conv2
I0523 15:43:36.625396 2712679360 net.cpp:122] Setting up conv2
I0523 15:43:36.625402 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 15:43:36.625407 2712679360 net.cpp:137] Memory required for data: 24167600
I0523 15:43:36.625412 2712679360 layer_factory.hpp:77] Creating layer relu2
I0523 15:43:36.625417 2712679360 net.cpp:84] Creating Layer relu2
I0523 15:43:36.625422 2712679360 net.cpp:406] relu2 <- conv2
I0523 15:43:36.625425 2712679360 net.cpp:367] relu2 -> conv2 (in-place)
I0523 15:43:36.625429 2712679360 net.cpp:122] Setting up relu2
I0523 15:43:36.625433 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 15:43:36.625437 2712679360 net.cpp:137] Memory required for data: 27444400
I0523 15:43:36.625440 2712679360 layer_factory.hpp:77] Creating layer pool2
I0523 15:43:36.625445 2712679360 net.cpp:84] Creating Layer pool2
I0523 15:43:36.625448 2712679360 net.cpp:406] pool2 <- conv2
I0523 15:43:36.625452 2712679360 net.cpp:380] pool2 -> pool2
I0523 15:43:36.625458 2712679360 net.cpp:122] Setting up pool2
I0523 15:43:36.625460 2712679360 net.cpp:129] Top shape: 100 32 8 8 (204800)
I0523 15:43:36.625464 2712679360 net.cpp:137] Memory required for data: 28263600
I0523 15:43:36.625468 2712679360 layer_factory.hpp:77] Creating layer conv3
I0523 15:43:36.625474 2712679360 net.cpp:84] Creating Layer conv3
I0523 15:43:36.625479 2712679360 net.cpp:406] conv3 <- pool2
I0523 15:43:36.625483 2712679360 net.cpp:380] conv3 -> conv3
I0523 15:43:36.626077 2712679360 net.cpp:122] Setting up conv3
I0523 15:43:36.626083 2712679360 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0523 15:43:36.626088 2712679360 net.cpp:137] Memory required for data: 29902000
I0523 15:43:36.626093 2712679360 layer_factory.hpp:77] Creating layer relu3
I0523 15:43:36.626098 2712679360 net.cpp:84] Creating Layer relu3
I0523 15:43:36.626101 2712679360 net.cpp:406] relu3 <- conv3
I0523 15:43:36.626106 2712679360 net.cpp:367] relu3 -> conv3 (in-place)
I0523 15:43:36.626111 2712679360 net.cpp:122] Setting up relu3
I0523 15:43:36.626113 2712679360 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0523 15:43:36.626117 2712679360 net.cpp:137] Memory required for data: 31540400
I0523 15:43:36.626121 2712679360 layer_factory.hpp:77] Creating layer pool3
I0523 15:43:36.626126 2712679360 net.cpp:84] Creating Layer pool3
I0523 15:43:36.626129 2712679360 net.cpp:406] pool3 <- conv3
I0523 15:43:36.626145 2712679360 net.cpp:380] pool3 -> pool3
I0523 15:43:36.626152 2712679360 net.cpp:122] Setting up pool3
I0523 15:43:36.626154 2712679360 net.cpp:129] Top shape: 100 64 4 4 (102400)
I0523 15:43:36.626159 2712679360 net.cpp:137] Memory required for data: 31950000
I0523 15:43:36.626163 2712679360 layer_factory.hpp:77] Creating layer ip1
I0523 15:43:36.626168 2712679360 net.cpp:84] Creating Layer ip1
I0523 15:43:36.626173 2712679360 net.cpp:406] ip1 <- pool3
I0523 15:43:36.626176 2712679360 net.cpp:380] ip1 -> ip1
I0523 15:43:36.626969 2712679360 net.cpp:122] Setting up ip1
I0523 15:43:36.626981 2712679360 net.cpp:129] Top shape: 100 64 (6400)
I0523 15:43:36.626986 2712679360 net.cpp:137] Memory required for data: 31975600
I0523 15:43:36.626992 2712679360 layer_factory.hpp:77] Creating layer ip2
I0523 15:43:36.626999 2712679360 net.cpp:84] Creating Layer ip2
I0523 15:43:36.627003 2712679360 net.cpp:406] ip2 <- ip1
I0523 15:43:36.627008 2712679360 net.cpp:380] ip2 -> ip2
I0523 15:43:36.627024 2712679360 net.cpp:122] Setting up ip2
I0523 15:43:36.627028 2712679360 net.cpp:129] Top shape: 100 10 (1000)
I0523 15:43:36.627032 2712679360 net.cpp:137] Memory required for data: 31979600
I0523 15:43:36.627039 2712679360 layer_factory.hpp:77] Creating layer ip2_ip2_0_split
I0523 15:43:36.627046 2712679360 net.cpp:84] Creating Layer ip2_ip2_0_split
I0523 15:43:36.627053 2712679360 net.cpp:406] ip2_ip2_0_split <- ip2
I0523 15:43:36.627059 2712679360 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_0
I0523 15:43:36.627068 2712679360 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_1
I0523 15:43:36.627076 2712679360 net.cpp:122] Setting up ip2_ip2_0_split
I0523 15:43:36.627081 2712679360 net.cpp:129] Top shape: 100 10 (1000)
I0523 15:43:36.627085 2712679360 net.cpp:129] Top shape: 100 10 (1000)
I0523 15:43:36.627089 2712679360 net.cpp:137] Memory required for data: 31987600
I0523 15:43:36.627094 2712679360 layer_factory.hpp:77] Creating layer accuracy
I0523 15:43:36.627099 2712679360 net.cpp:84] Creating Layer accuracy
I0523 15:43:36.627102 2712679360 net.cpp:406] accuracy <- ip2_ip2_0_split_0
I0523 15:43:36.627106 2712679360 net.cpp:406] accuracy <- label_cifar_1_split_0
I0523 15:43:36.627110 2712679360 net.cpp:380] accuracy -> accuracy
I0523 15:43:36.627116 2712679360 net.cpp:122] Setting up accuracy
I0523 15:43:36.627120 2712679360 net.cpp:129] Top shape: (1)
I0523 15:43:36.627123 2712679360 net.cpp:137] Memory required for data: 31987604
I0523 15:43:36.627126 2712679360 layer_factory.hpp:77] Creating layer loss
I0523 15:43:36.627133 2712679360 net.cpp:84] Creating Layer loss
I0523 15:43:36.627169 2712679360 net.cpp:406] loss <- ip2_ip2_0_split_1
I0523 15:43:36.627178 2712679360 net.cpp:406] loss <- label_cifar_1_split_1
I0523 15:43:36.627183 2712679360 net.cpp:380] loss -> loss
I0523 15:43:36.627189 2712679360 layer_factory.hpp:77] Creating layer loss
I0523 15:43:36.627198 2712679360 net.cpp:122] Setting up loss
I0523 15:43:36.627202 2712679360 net.cpp:129] Top shape: (1)
I0523 15:43:36.627207 2712679360 net.cpp:132]     with loss weight 1
I0523 15:43:36.627213 2712679360 net.cpp:137] Memory required for data: 31987608
I0523 15:43:36.627215 2712679360 net.cpp:198] loss needs backward computation.
I0523 15:43:36.627219 2712679360 net.cpp:200] accuracy does not need backward computation.
I0523 15:43:36.627223 2712679360 net.cpp:198] ip2_ip2_0_split needs backward computation.
I0523 15:43:36.627228 2712679360 net.cpp:198] ip2 needs backward computation.
I0523 15:43:36.627230 2712679360 net.cpp:198] ip1 needs backward computation.
I0523 15:43:36.627234 2712679360 net.cpp:198] pool3 needs backward computation.
I0523 15:43:36.627321 2712679360 net.cpp:198] relu3 needs backward computation.
I0523 15:43:36.627334 2712679360 net.cpp:198] conv3 needs backward computation.
I0523 15:43:36.627341 2712679360 net.cpp:198] pool2 needs backward computation.
I0523 15:43:36.627348 2712679360 net.cpp:198] relu2 needs backward computation.
I0523 15:43:36.627354 2712679360 net.cpp:198] conv2 needs backward computation.
I0523 15:43:36.627387 2712679360 net.cpp:198] relu1 needs backward computation.
I0523 15:43:36.627394 2712679360 net.cpp:198] pool1 needs backward computation.
I0523 15:43:36.627400 2712679360 net.cpp:198] conv1 needs backward computation.
I0523 15:43:36.627409 2712679360 net.cpp:200] label_cifar_1_split does not need backward computation.
I0523 15:43:36.627418 2712679360 net.cpp:200] cifar does not need backward computation.
I0523 15:43:36.627432 2712679360 net.cpp:242] This network produces output accuracy
I0523 15:43:36.627454 2712679360 net.cpp:242] This network produces output loss
I0523 15:43:36.627470 2712679360 net.cpp:255] Network initialization done.
I0523 15:43:36.627553 2712679360 solver.cpp:56] Solver scaffolding done.
I0523 15:43:36.627593 2712679360 caffe.cpp:248] Starting Optimization
I0523 15:43:36.627602 2712679360 solver.cpp:272] Solving CIFAR10_quick
I0523 15:43:36.627610 2712679360 solver.cpp:273] Learning Rate Policy: fixed
I0523 15:43:36.627933 2712679360 solver.cpp:330] Iteration 0, Testing net (#0)
I0523 15:43:46.157997 1515520 data_layer.cpp:73] Restarting data prefetching from start.
I0523 15:43:46.542196 2712679360 solver.cpp:397]     Test net output #0: accuracy = 0.0865
I0523 15:43:46.542232 2712679360 solver.cpp:397]     Test net output #1: loss = 2.3025 (* 1 = 2.3025 loss)
I0523 15:43:46.784966 2712679360 solver.cpp:218] Iteration 0 (0 iter/s, 10.157s/100 iters), loss = 2.30202
I0523 15:43:46.785002 2712679360 solver.cpp:237]     Train net output #0: loss = 2.30202 (* 1 = 2.30202 loss)
I0523 15:43:46.785009 2712679360 sgd_solver.cpp:105] Iteration 0, lr = 0.001
I0523 15:44:08.112608 2712679360 solver.cpp:218] Iteration 100 (4.68889 iter/s, 21.327s/100 iters), loss = 1.67773
I0523 15:44:08.112664 2712679360 solver.cpp:237]     Train net output #0: loss = 1.67773 (* 1 = 1.67773 loss)
I0523 15:44:08.112673 2712679360 sgd_solver.cpp:105] Iteration 100, lr = 0.001
I0523 15:44:29.336644 2712679360 solver.cpp:218] Iteration 200 (4.71187 iter/s, 21.223s/100 iters), loss = 1.59886
I0523 15:44:29.336683 2712679360 solver.cpp:237]     Train net output #0: loss = 1.59886 (* 1 = 1.59886 loss)
I0523 15:44:29.336693 2712679360 sgd_solver.cpp:105] Iteration 200, lr = 0.001
I0523 15:44:50.573981 2712679360 solver.cpp:218] Iteration 300 (4.70876 iter/s, 21.237s/100 iters), loss = 1.31839
I0523 15:44:50.574038 2712679360 solver.cpp:237]     Train net output #0: loss = 1.31839 (* 1 = 1.31839 loss)
I0523 15:44:50.574044 2712679360 sgd_solver.cpp:105] Iteration 300, lr = 0.001
I0523 15:45:12.080576 2712679360 solver.cpp:218] Iteration 400 (4.64987 iter/s, 21.506s/100 iters), loss = 1.24876
I0523 15:45:12.080610 2712679360 solver.cpp:237]     Train net output #0: loss = 1.24876 (* 1 = 1.24876 loss)
I0523 15:45:12.080618 2712679360 sgd_solver.cpp:105] Iteration 400, lr = 0.001
I0523 15:45:32.450579 978944 data_layer.cpp:73] Restarting data prefetching from start.
I0523 15:45:33.342396 2712679360 solver.cpp:330] Iteration 500, Testing net (#0)
I0523 15:45:42.732501 1515520 data_layer.cpp:73] Restarting data prefetching from start.
I0523 15:45:43.134589 2712679360 solver.cpp:397]     Test net output #0: accuracy = 0.5366
I0523 15:45:43.134620 2712679360 solver.cpp:397]     Test net output #1: loss = 1.31952 (* 1 = 1.31952 loss)
I0523 15:45:43.360550 2712679360 solver.cpp:218] Iteration 500 (3.19703 iter/s, 31.279s/100 iters), loss = 1.22391
I0523 15:45:43.360582 2712679360 solver.cpp:237]     Train net output #0: loss = 1.22391 (* 1 = 1.22391 loss)
I0523 15:45:43.360589 2712679360 sgd_solver.cpp:105] Iteration 500, lr = 0.001
I0523 15:46:06.734716 2712679360 solver.cpp:218] Iteration 600 (4.27826 iter/s, 23.374s/100 iters), loss = 1.23177
I0523 15:46:06.734771 2712679360 solver.cpp:237]     Train net output #0: loss = 1.23177 (* 1 = 1.23177 loss)
I0523 15:46:06.734779 2712679360 sgd_solver.cpp:105] Iteration 600, lr = 0.001
....... the intervening iterations follow the same pattern and are omitted ...
I0523 16:00:46.286926 2712679360 solver.cpp:218] Iteration 3900 (4.08731 iter/s, 24.466s/100 iters), loss = 0.557826
I0523 16:00:46.286960 2712679360 solver.cpp:237]     Train net output #0: loss = 0.557826 (* 1 = 0.557826 loss)
I0523 16:00:46.286967 2712679360 sgd_solver.cpp:105] Iteration 3900, lr = 0.001
I0523 16:01:09.469552 978944 data_layer.cpp:73] Restarting data prefetching from start.
I0523 16:01:10.472170 2712679360 solver.cpp:447] Snapshotting to binary proto file examples/cifar10/cifar10_quick_iter_4000.caffemodel
I0523 16:01:10.475755 2712679360 sgd_solver.cpp:273] Snapshotting solver state to binary proto file examples/cifar10/cifar10_quick_iter_4000.solverstate
I0523 16:01:10.590515 2712679360 solver.cpp:310] Iteration 4000, loss = 0.641508
I0523 16:01:10.590548 2712679360 solver.cpp:330] Iteration 4000, Testing net (#0)
I0523 16:01:21.619536 1515520 data_layer.cpp:73] Restarting data prefetching from start.
I0523 16:01:22.054498 2712679360 solver.cpp:397]     Test net output #0: accuracy = 0.7119
I0523 16:01:22.054538 2712679360 solver.cpp:397]     Test net output #1: loss = 0.848064 (* 1 = 0.848064 loss)
I0523 16:01:22.054548 2712679360 solver.cpp:315] Optimization Done.
I0523 16:01:22.054555 2712679360 caffe.cpp:259] Optimization Done.
I0523 16:01:22.119184 2712679360 caffe.cpp:211] Use CPU.
I0523 16:01:22.120214 2712679360 solver.cpp:44] Initializing solver from parameters:
test_iter: 100
test_interval: 500
base_lr: 0.0001
display: 100
max_iter: 5000
lr_policy: "fixed"
momentum: 0.9
weight_decay: 0.004
snapshot: 5000
snapshot_prefix: "examples/cifar10/cifar10_quick"
solver_mode: CPU
net: "examples/cifar10/cifar10_quick_train_test.prototxt"
train_state {
  level: 0
  stage: ""
}
snapshot_format: HDF5
I0523 16:01:22.120556 2712679360 solver.cpp:87] Creating training net from net file: examples/cifar10/cifar10_quick_train_test.prototxt
I0523 16:01:22.120817 2712679360 net.cpp:294] The NetState phase (0) differed from the phase (1) specified by a rule in layer cifar
I0523 16:01:22.120833 2712679360 net.cpp:294] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy
I0523 16:01:22.120841 2712679360 net.cpp:51] Initializing net from parameters:
name: "CIFAR10_quick"
state {
  phase: TRAIN
  level: 0
  stage: ""
}
layer {
  name: "cifar"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    mean_file: "examples/cifar10/mean.binaryproto"
  }
  data_param {
    source: "examples/cifar10/cifar10_train_lmdb"
    batch_size: 100
    backend: LMDB
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.0001
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "pool1"
  top: "pool1"
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 64
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "pool3"
  type: "Pooling"
  bottom: "conv3"
  top: "pool3"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "pool3"
  top: "ip1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 64
    weight_filler {
      type: "gaussian"
      std: 0.1
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "ip2"
  type: "InnerProduct"
  bottom: "ip1"
  top: "ip2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 10
    weight_filler {
      type: "gaussian"
      std: 0.1
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "ip2"
  bottom: "label"
  top: "loss"
}
I0523 16:01:22.121104 2712679360 layer_factory.hpp:77] Creating layer cifar
I0523 16:01:22.121320 2712679360 db_lmdb.cpp:35] Opened lmdb examples/cifar10/cifar10_train_lmdb
I0523 16:01:22.121383 2712679360 net.cpp:84] Creating Layer cifar
I0523 16:01:22.121393 2712679360 net.cpp:380] cifar -> data
I0523 16:01:22.121413 2712679360 net.cpp:380] cifar -> label
I0523 16:01:22.121431 2712679360 data_transformer.cpp:25] Loading mean file from: examples/cifar10/mean.binaryproto
I0523 16:01:22.121585 2712679360 data_layer.cpp:45] output data size: 100,3,32,32
I0523 16:01:22.128842 2712679360 net.cpp:122] Setting up cifar
I0523 16:01:22.128867 2712679360 net.cpp:129] Top shape: 100 3 32 32 (307200)
I0523 16:01:22.128875 2712679360 net.cpp:129] Top shape: 100 (100)
I0523 16:01:22.128880 2712679360 net.cpp:137] Memory required for data: 1229200
I0523 16:01:22.128890 2712679360 layer_factory.hpp:77] Creating layer conv1
I0523 16:01:22.128902 2712679360 net.cpp:84] Creating Layer conv1
I0523 16:01:22.128907 2712679360 net.cpp:406] conv1 <- data
I0523 16:01:22.128914 2712679360 net.cpp:380] conv1 -> conv1
I0523 16:01:22.129009 2712679360 net.cpp:122] Setting up conv1
I0523 16:01:22.129017 2712679360 net.cpp:129] Top shape: 100 32 32 32 (3276800)
I0523 16:01:22.129022 2712679360 net.cpp:137] Memory required for data: 14336400
I0523 16:01:22.129030 2712679360 layer_factory.hpp:77] Creating layer pool1
I0523 16:01:22.129039 2712679360 net.cpp:84] Creating Layer pool1
I0523 16:01:22.129042 2712679360 net.cpp:406] pool1 <- conv1
I0523 16:01:22.129047 2712679360 net.cpp:380] pool1 -> pool1
I0523 16:01:22.129057 2712679360 net.cpp:122] Setting up pool1
I0523 16:01:22.129062 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 16:01:22.129067 2712679360 net.cpp:137] Memory required for data: 17613200
I0523 16:01:22.129071 2712679360 layer_factory.hpp:77] Creating layer relu1
I0523 16:01:22.129078 2712679360 net.cpp:84] Creating Layer relu1
I0523 16:01:22.129083 2712679360 net.cpp:406] relu1 <- pool1
I0523 16:01:22.129087 2712679360 net.cpp:367] relu1 -> pool1 (in-place)
I0523 16:01:22.129093 2712679360 net.cpp:122] Setting up relu1
I0523 16:01:22.129097 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 16:01:22.129102 2712679360 net.cpp:137] Memory required for data: 20890000
I0523 16:01:22.129106 2712679360 layer_factory.hpp:77] Creating layer conv2
I0523 16:01:22.129117 2712679360 net.cpp:84] Creating Layer conv2
I0523 16:01:22.129120 2712679360 net.cpp:406] conv2 <- pool1
I0523 16:01:22.129125 2712679360 net.cpp:380] conv2 -> conv2
I0523 16:01:22.129482 2712679360 net.cpp:122] Setting up conv2
I0523 16:01:22.129487 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 16:01:22.129493 2712679360 net.cpp:137] Memory required for data: 24166800
I0523 16:01:22.129500 2712679360 layer_factory.hpp:77] Creating layer relu2
I0523 16:01:22.129505 2712679360 net.cpp:84] Creating Layer relu2
I0523 16:01:22.129509 2712679360 net.cpp:406] relu2 <- conv2
I0523 16:01:22.129514 2712679360 net.cpp:367] relu2 -> conv2 (in-place)
I0523 16:01:22.129520 2712679360 net.cpp:122] Setting up relu2
I0523 16:01:22.129524 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 16:01:22.129528 2712679360 net.cpp:137] Memory required for data: 27443600
I0523 16:01:22.129534 2712679360 layer_factory.hpp:77] Creating layer pool2
I0523 16:01:22.129537 2712679360 net.cpp:84] Creating Layer pool2
I0523 16:01:22.129541 2712679360 net.cpp:406] pool2 <- conv2
I0523 16:01:22.129547 2712679360 net.cpp:380] pool2 -> pool2
I0523 16:01:22.129554 2712679360 net.cpp:122] Setting up pool2
I0523 16:01:22.129557 2712679360 net.cpp:129] Top shape: 100 32 8 8 (204800)
I0523 16:01:22.129562 2712679360 net.cpp:137] Memory required for data: 28262800
I0523 16:01:22.129566 2712679360 layer_factory.hpp:77] Creating layer conv3
I0523 16:01:22.129573 2712679360 net.cpp:84] Creating Layer conv3
I0523 16:01:22.129577 2712679360 net.cpp:406] conv3 <- pool2
I0523 16:01:22.129585 2712679360 net.cpp:380] conv3 -> conv3
I0523 16:01:22.130280 2712679360 net.cpp:122] Setting up conv3
I0523 16:01:22.130286 2712679360 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0523 16:01:22.130292 2712679360 net.cpp:137] Memory required for data: 29901200
I0523 16:01:22.130298 2712679360 layer_factory.hpp:77] Creating layer relu3
I0523 16:01:22.130304 2712679360 net.cpp:84] Creating Layer relu3
I0523 16:01:22.130308 2712679360 net.cpp:406] relu3 <- conv3
I0523 16:01:22.130313 2712679360 net.cpp:367] relu3 -> conv3 (in-place)
I0523 16:01:22.130318 2712679360 net.cpp:122] Setting up relu3
I0523 16:01:22.130353 2712679360 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0523 16:01:22.130360 2712679360 net.cpp:137] Memory required for data: 31539600
I0523 16:01:22.130364 2712679360 layer_factory.hpp:77] Creating layer pool3
I0523 16:01:22.130370 2712679360 net.cpp:84] Creating Layer pool3
I0523 16:01:22.130374 2712679360 net.cpp:406] pool3 <- conv3
I0523 16:01:22.130379 2712679360 net.cpp:380] pool3 -> pool3
I0523 16:01:22.130385 2712679360 net.cpp:122] Setting up pool3
I0523 16:01:22.130389 2712679360 net.cpp:129] Top shape: 100 64 4 4 (102400)
I0523 16:01:22.130396 2712679360 net.cpp:137] Memory required for data: 31949200
I0523 16:01:22.130400 2712679360 layer_factory.hpp:77] Creating layer ip1
I0523 16:01:22.130409 2712679360 net.cpp:84] Creating Layer ip1
I0523 16:01:22.130414 2712679360 net.cpp:406] ip1 <- pool3
I0523 16:01:22.130419 2712679360 net.cpp:380] ip1 -> ip1
I0523 16:01:22.131337 2712679360 net.cpp:122] Setting up ip1
I0523 16:01:22.131347 2712679360 net.cpp:129] Top shape: 100 64 (6400)
I0523 16:01:22.131352 2712679360 net.cpp:137] Memory required for data: 31974800
I0523 16:01:22.131358 2712679360 layer_factory.hpp:77] Creating layer ip2
I0523 16:01:22.131364 2712679360 net.cpp:84] Creating Layer ip2
I0523 16:01:22.131369 2712679360 net.cpp:406] ip2 <- ip1
I0523 16:01:22.131374 2712679360 net.cpp:380] ip2 -> ip2
I0523 16:01:22.131392 2712679360 net.cpp:122] Setting up ip2
I0523 16:01:22.131397 2712679360 net.cpp:129] Top shape: 100 10 (1000)
I0523 16:01:22.131400 2712679360 net.cpp:137] Memory required for data: 31978800
I0523 16:01:22.131407 2712679360 layer_factory.hpp:77] Creating layer loss
I0523 16:01:22.131413 2712679360 net.cpp:84] Creating Layer loss
I0523 16:01:22.131417 2712679360 net.cpp:406] loss <- ip2
I0523 16:01:22.131422 2712679360 net.cpp:406] loss <- label
I0523 16:01:22.131427 2712679360 net.cpp:380] loss -> loss
I0523 16:01:22.131435 2712679360 layer_factory.hpp:77] Creating layer loss
I0523 16:01:22.131448 2712679360 net.cpp:122] Setting up loss
I0523 16:01:22.131453 2712679360 net.cpp:129] Top shape: (1)
I0523 16:01:22.131458 2712679360 net.cpp:132]     with loss weight 1
I0523 16:01:22.131471 2712679360 net.cpp:137] Memory required for data: 31978804
I0523 16:01:22.131476 2712679360 net.cpp:198] loss needs backward computation.
I0523 16:01:22.131495 2712679360 net.cpp:198] ip2 needs backward computation.
I0523 16:01:22.131505 2712679360 net.cpp:198] ip1 needs backward computation.
I0523 16:01:22.131510 2712679360 net.cpp:198] pool3 needs backward computation.
I0523 16:01:22.131515 2712679360 net.cpp:198] relu3 needs backward computation.
I0523 16:01:22.131518 2712679360 net.cpp:198] conv3 needs backward computation.
I0523 16:01:22.131522 2712679360 net.cpp:198] pool2 needs backward computation.
I0523 16:01:22.131527 2712679360 net.cpp:198] relu2 needs backward computation.
I0523 16:01:22.131531 2712679360 net.cpp:198] conv2 needs backward computation.
I0523 16:01:22.131536 2712679360 net.cpp:198] relu1 needs backward computation.
I0523 16:01:22.131541 2712679360 net.cpp:198] pool1 needs backward computation.
I0523 16:01:22.131544 2712679360 net.cpp:198] conv1 needs backward computation.
I0523 16:01:22.131548 2712679360 net.cpp:200] cifar does not need backward computation.
I0523 16:01:22.131552 2712679360 net.cpp:242] This network produces output loss
I0523 16:01:22.131561 2712679360 net.cpp:255] Network initialization done.
I0523 16:01:22.131786 2712679360 solver.cpp:172] Creating test net (#0) specified by net file: examples/cifar10/cifar10_quick_train_test.prototxt
I0523 16:01:22.131814 2712679360 net.cpp:294] The NetState phase (1) differed from the phase (0) specified by a rule in layer cifar
I0523 16:01:22.131826 2712679360 net.cpp:51] Initializing net from parameters:
name: "CIFAR10_quick"
state {
  phase: TEST
}
layer {
  name: "cifar"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TEST
  }
  transform_param {
    mean_file: "examples/cifar10/mean.binaryproto"
  }
  data_param {
    source: "examples/cifar10/cifar10_test_lmdb"
    batch_size: 100
    backend: LMDB
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.0001
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "pool1"
  top: "pool1"
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 64
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "pool3"
  type: "Pooling"
  bottom: "conv3"
  top: "pool3"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "pool3"
  top: "ip1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 64
    weight_filler {
      type: "gaussian"
      std: 0.1
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "ip2"
  type: "InnerProduct"
  bottom: "ip1"
  top: "ip2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 10
    weight_filler {
      type: "gaussian"
      std: 0.1
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "ip2"
  bottom: "label"
  top: "accuracy"
  include {
    phase: TEST
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "ip2"
  bottom: "label"
  top: "loss"
}
I0523 16:01:22.132225 2712679360 layer_factory.hpp:77] Creating layer cifar
I0523 16:01:22.132313 2712679360 db_lmdb.cpp:35] Opened lmdb examples/cifar10/cifar10_test_lmdb
I0523 16:01:22.132342 2712679360 net.cpp:84] Creating Layer cifar
I0523 16:01:22.132356 2712679360 net.cpp:380] cifar -> data
I0523 16:01:22.132364 2712679360 net.cpp:380] cifar -> label
I0523 16:01:22.132372 2712679360 data_transformer.cpp:25] Loading mean file from: examples/cifar10/mean.binaryproto
I0523 16:01:22.132438 2712679360 data_layer.cpp:45] output data size: 100,3,32,32
I0523 16:01:22.134943 2712679360 net.cpp:122] Setting up cifar
I0523 16:01:22.134956 2712679360 net.cpp:129] Top shape: 100 3 32 32 (307200)
I0523 16:01:22.134963 2712679360 net.cpp:129] Top shape: 100 (100)
I0523 16:01:22.134968 2712679360 net.cpp:137] Memory required for data: 1229200
I0523 16:01:22.134974 2712679360 layer_factory.hpp:77] Creating layer label_cifar_1_split
I0523 16:01:22.134984 2712679360 net.cpp:84] Creating Layer label_cifar_1_split
I0523 16:01:22.135015 2712679360 net.cpp:406] label_cifar_1_split <- label
I0523 16:01:22.135064 2712679360 net.cpp:380] label_cifar_1_split -> label_cifar_1_split_0
I0523 16:01:22.135078 2712679360 net.cpp:380] label_cifar_1_split -> label_cifar_1_split_1
I0523 16:01:22.135116 2712679360 net.cpp:122] Setting up label_cifar_1_split
I0523 16:01:22.135167 2712679360 net.cpp:129] Top shape: 100 (100)
I0523 16:01:22.135203 2712679360 net.cpp:129] Top shape: 100 (100)
I0523 16:01:22.135241 2712679360 net.cpp:137] Memory required for data: 1230000
I0523 16:01:22.135313 2712679360 layer_factory.hpp:77] Creating layer conv1
I0523 16:01:22.135330 2712679360 net.cpp:84] Creating Layer conv1
I0523 16:01:22.135335 2712679360 net.cpp:406] conv1 <- data
I0523 16:01:22.135342 2712679360 net.cpp:380] conv1 -> conv1
I0523 16:01:22.135398 2712679360 net.cpp:122] Setting up conv1
I0523 16:01:22.135404 2712679360 net.cpp:129] Top shape: 100 32 32 32 (3276800)
I0523 16:01:22.135411 2712679360 net.cpp:137] Memory required for data: 14337200
I0523 16:01:22.135418 2712679360 layer_factory.hpp:77] Creating layer pool1
I0523 16:01:22.135463 2712679360 net.cpp:84] Creating Layer pool1
I0523 16:01:22.135473 2712679360 net.cpp:406] pool1 <- conv1
I0523 16:01:22.135514 2712679360 net.cpp:380] pool1 -> pool1
I0523 16:01:22.135565 2712679360 net.cpp:122] Setting up pool1
I0523 16:01:22.135574 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 16:01:22.135581 2712679360 net.cpp:137] Memory required for data: 17614000
I0523 16:01:22.135586 2712679360 layer_factory.hpp:77] Creating layer relu1
I0523 16:01:22.135593 2712679360 net.cpp:84] Creating Layer relu1
I0523 16:01:22.135598 2712679360 net.cpp:406] relu1 <- pool1
I0523 16:01:22.135603 2712679360 net.cpp:367] relu1 -> pool1 (in-place)
I0523 16:01:22.135609 2712679360 net.cpp:122] Setting up relu1
I0523 16:01:22.135613 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 16:01:22.135666 2712679360 net.cpp:137] Memory required for data: 20890800
I0523 16:01:22.135673 2712679360 layer_factory.hpp:77] Creating layer conv2
I0523 16:01:22.135681 2712679360 net.cpp:84] Creating Layer conv2
I0523 16:01:22.135686 2712679360 net.cpp:406] conv2 <- pool1
I0523 16:01:22.135700 2712679360 net.cpp:380] conv2 -> conv2
I0523 16:01:22.136068 2712679360 net.cpp:122] Setting up conv2
I0523 16:01:22.136076 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 16:01:22.136081 2712679360 net.cpp:137] Memory required for data: 24167600
I0523 16:01:22.136088 2712679360 layer_factory.hpp:77] Creating layer relu2
I0523 16:01:22.136095 2712679360 net.cpp:84] Creating Layer relu2
I0523 16:01:22.136098 2712679360 net.cpp:406] relu2 <- conv2
I0523 16:01:22.136103 2712679360 net.cpp:367] relu2 -> conv2 (in-place)
I0523 16:01:22.136108 2712679360 net.cpp:122] Setting up relu2
I0523 16:01:22.136112 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 16:01:22.136117 2712679360 net.cpp:137] Memory required for data: 27444400
I0523 16:01:22.136121 2712679360 layer_factory.hpp:77] Creating layer pool2
I0523 16:01:22.136127 2712679360 net.cpp:84] Creating Layer pool2
I0523 16:01:22.136132 2712679360 net.cpp:406] pool2 <- conv2
I0523 16:01:22.136135 2712679360 net.cpp:380] pool2 -> pool2
I0523 16:01:22.136142 2712679360 net.cpp:122] Setting up pool2
I0523 16:01:22.136147 2712679360 net.cpp:129] Top shape: 100 32 8 8 (204800)
I0523 16:01:22.136152 2712679360 net.cpp:137] Memory required for data: 28263600
I0523 16:01:22.136157 2712679360 layer_factory.hpp:77] Creating layer conv3
I0523 16:01:22.136163 2712679360 net.cpp:84] Creating Layer conv3
I0523 16:01:22.136168 2712679360 net.cpp:406] conv3 <- pool2
I0523 16:01:22.136173 2712679360 net.cpp:380] conv3 -> conv3
I0523 16:01:22.136878 2712679360 net.cpp:122] Setting up conv3
I0523 16:01:22.136888 2712679360 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0523 16:01:22.136893 2712679360 net.cpp:137] Memory required for data: 29902000
I0523 16:01:22.136899 2712679360 layer_factory.hpp:77] Creating layer relu3
I0523 16:01:22.136904 2712679360 net.cpp:84] Creating Layer relu3
I0523 16:01:22.136909 2712679360 net.cpp:406] relu3 <- conv3
I0523 16:01:22.136914 2712679360 net.cpp:367] relu3 -> conv3 (in-place)
I0523 16:01:22.136919 2712679360 net.cpp:122] Setting up relu3
I0523 16:01:22.136930 2712679360 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0523 16:01:22.136961 2712679360 net.cpp:137] Memory required for data: 31540400
I0523 16:01:22.136968 2712679360 layer_factory.hpp:77] Creating layer pool3
I0523 16:01:22.136976 2712679360 net.cpp:84] Creating Layer pool3
I0523 16:01:22.137001 2712679360 net.cpp:406] pool3 <- conv3
I0523 16:01:22.137008 2712679360 net.cpp:380] pool3 -> pool3
I0523 16:01:22.137017 2712679360 net.cpp:122] Setting up pool3
I0523 16:01:22.137022 2712679360 net.cpp:129] Top shape: 100 64 4 4 (102400)
I0523 16:01:22.137027 2712679360 net.cpp:137] Memory required for data: 31950000
I0523 16:01:22.137032 2712679360 layer_factory.hpp:77] Creating layer ip1
I0523 16:01:22.137039 2712679360 net.cpp:84] Creating Layer ip1
I0523 16:01:22.137044 2712679360 net.cpp:406] ip1 <- pool3
I0523 16:01:22.137050 2712679360 net.cpp:380] ip1 -> ip1
I0523 16:01:22.137981 2712679360 net.cpp:122] Setting up ip1
I0523 16:01:22.137995 2712679360 net.cpp:129] Top shape: 100 64 (6400)
I0523 16:01:22.138002 2712679360 net.cpp:137] Memory required for data: 31975600
I0523 16:01:22.138008 2712679360 layer_factory.hpp:77] Creating layer ip2
I0523 16:01:22.138016 2712679360 net.cpp:84] Creating Layer ip2
I0523 16:01:22.138021 2712679360 net.cpp:406] ip2 <- ip1
I0523 16:01:22.138027 2712679360 net.cpp:380] ip2 -> ip2
I0523 16:01:22.138046 2712679360 net.cpp:122] Setting up ip2
I0523 16:01:22.138051 2712679360 net.cpp:129] Top shape: 100 10 (1000)
I0523 16:01:22.138056 2712679360 net.cpp:137] Memory required for data: 31979600
I0523 16:01:22.138062 2712679360 layer_factory.hpp:77] Creating layer ip2_ip2_0_split
I0523 16:01:22.138085 2712679360 net.cpp:84] Creating Layer ip2_ip2_0_split
I0523 16:01:22.138103 2712679360 net.cpp:406] ip2_ip2_0_split <- ip2
I0523 16:01:22.138115 2712679360 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_0
I0523 16:01:22.138129 2712679360 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_1
I0523 16:01:22.138142 2712679360 net.cpp:122] Setting up ip2_ip2_0_split
I0523 16:01:22.138150 2712679360 net.cpp:129] Top shape: 100 10 (1000)
I0523 16:01:22.138160 2712679360 net.cpp:129] Top shape: 100 10 (1000)
I0523 16:01:22.138170 2712679360 net.cpp:137] Memory required for data: 31987600
I0523 16:01:22.138177 2712679360 layer_factory.hpp:77] Creating layer accuracy
I0523 16:01:22.138187 2712679360 net.cpp:84] Creating Layer accuracy
I0523 16:01:22.138219 2712679360 net.cpp:406] accuracy <- ip2_ip2_0_split_0
I0523 16:01:22.138231 2712679360 net.cpp:406] accuracy <- label_cifar_1_split_0
I0523 16:01:22.138242 2712679360 net.cpp:380] accuracy -> accuracy
I0523 16:01:22.138257 2712679360 net.cpp:122] Setting up accuracy
I0523 16:01:22.138264 2712679360 net.cpp:129] Top shape: (1)
I0523 16:01:22.138274 2712679360 net.cpp:137] Memory required for data: 31987604
I0523 16:01:22.138279 2712679360 layer_factory.hpp:77] Creating layer loss
I0523 16:01:22.138286 2712679360 net.cpp:84] Creating Layer loss
I0523 16:01:22.138290 2712679360 net.cpp:406] loss <- ip2_ip2_0_split_1
I0523 16:01:22.138327 2712679360 net.cpp:406] loss <- label_cifar_1_split_1
I0523 16:01:22.138334 2712679360 net.cpp:380] loss -> loss
I0523 16:01:22.138342 2712679360 layer_factory.hpp:77] Creating layer loss
I0523 16:01:22.138352 2712679360 net.cpp:122] Setting up loss
I0523 16:01:22.138357 2712679360 net.cpp:129] Top shape: (1)
I0523 16:01:22.138362 2712679360 net.cpp:132]     with loss weight 1
I0523 16:01:22.138368 2712679360 net.cpp:137] Memory required for data: 31987608
I0523 16:01:22.138372 2712679360 net.cpp:198] loss needs backward computation.
I0523 16:01:22.138377 2712679360 net.cpp:200] accuracy does not need backward computation.
I0523 16:01:22.138382 2712679360 net.cpp:198] ip2_ip2_0_split needs backward computation.
I0523 16:01:22.138386 2712679360 net.cpp:198] ip2 needs backward computation.
I0523 16:01:22.138391 2712679360 net.cpp:198] ip1 needs backward computation.
I0523 16:01:22.138396 2712679360 net.cpp:198] pool3 needs backward computation.
I0523 16:01:22.138401 2712679360 net.cpp:198] relu3 needs backward computation.
I0523 16:01:22.138404 2712679360 net.cpp:198] conv3 needs backward computation.
I0523 16:01:22.138408 2712679360 net.cpp:198] pool2 needs backward computation.
I0523 16:01:22.138412 2712679360 net.cpp:198] relu2 needs backward computation.
I0523 16:01:22.138417 2712679360 net.cpp:198] conv2 needs backward computation.
I0523 16:01:22.138444 2712679360 net.cpp:198] relu1 needs backward computation.
I0523 16:01:22.138449 2712679360 net.cpp:198] pool1 needs backward computation.
I0523 16:01:22.138454 2712679360 net.cpp:198] conv1 needs backward computation.
I0523 16:01:22.138463 2712679360 net.cpp:200] label_cifar_1_split does not need backward computation.
I0523 16:01:22.138468 2712679360 net.cpp:200] cifar does not need backward computation.
I0523 16:01:22.138470 2712679360 net.cpp:242] This network produces output accuracy
I0523 16:01:22.138476 2712679360 net.cpp:242] This network produces output loss
I0523 16:01:22.138485 2712679360 net.cpp:255] Network initialization done.
I0523 16:01:22.138537 2712679360 solver.cpp:56] Solver scaffolding done.
I0523 16:01:22.138566 2712679360 caffe.cpp:242] Resuming from examples/cifar10/cifar10_quick_iter_4000.solverstate
I0523 16:01:22.139786 2712679360 sgd_solver.cpp:318] SGDSolver: restoring history
I0523 16:01:22.140019 2712679360 caffe.cpp:248] Starting Optimization
I0523 16:01:22.140027 2712679360 solver.cpp:272] Solving CIFAR10_quick
I0523 16:01:22.140031 2712679360 solver.cpp:273] Learning Rate Policy: fixed
I0523 16:01:22.140113 2712679360 solver.cpp:330] Iteration 4000, Testing net (#0)
I0523 16:01:32.383680 215015424 data_layer.cpp:73] Restarting data prefetching from start.
I0523 16:01:32.807214 2712679360 solver.cpp:397]     Test net output #0: accuracy = 0.7119
I0523 16:01:32.807250 2712679360 solver.cpp:397]     Test net output #1: loss = 0.848064 (* 1 = 0.848064 loss)
I0523 16:01:33.065510 2712679360 solver.cpp:218] Iteration 4000 (366.133 iter/s, 10.925s/100 iters), loss = 0.641508
I0523 16:01:33.065546 2712679360 solver.cpp:237]     Train net output #0: loss = 0.641508 (* 1 = 0.641508 loss)
I0523 16:01:33.065553 2712679360 sgd_solver.cpp:105] Iteration 4000, lr = 0.0001
I0523 16:01:56.950950 2712679360 solver.cpp:218] Iteration 4100 (4.18673 iter/s, 23.885s/100 iters), loss = 0.603556
I0523 16:01:56.951002 2712679360 solver.cpp:237]     Train net output #0: loss = 0.603556 (* 1 = 0.603556 loss)
I0523 16:01:56.951010 2712679360 sgd_solver.cpp:105] Iteration 4100, lr = 0.0001
I0523 16:02:21.127391 2712679360 solver.cpp:218] Iteration 4200 (4.13633 iter/s, 24.176s/100 iters), loss = 0.491505
I0523 16:02:21.127429 2712679360 solver.cpp:237]     Train net output #0: loss = 0.491505 (* 1 = 0.491505 loss)
I0523 16:02:21.127437 2712679360 sgd_solver.cpp:105] Iteration 4200, lr = 0.0001
I0523 16:02:46.283135 2712679360 solver.cpp:218] Iteration 4300 (3.97535 iter/s, 25.155s/100 iters), loss = 0.495313
I0523 16:02:46.283190 2712679360 solver.cpp:237]     Train net output #0: loss = 0.495313 (* 1 = 0.495313 loss)
I0523 16:02:46.283198 2712679360 sgd_solver.cpp:105] Iteration 4300, lr = 0.0001
I0523 16:03:10.841265 2712679360 solver.cpp:218] Iteration 4400 (4.07199 iter/s, 24.558s/100 iters), loss = 0.438567
I0523 16:03:10.841303 2712679360 solver.cpp:237]     Train net output #0: loss = 0.438567 (* 1 = 0.438567 loss)
I0523 16:03:10.841310 2712679360 sgd_solver.cpp:105] Iteration 4400, lr = 0.0001
I0523 16:03:33.942627 214478848 data_layer.cpp:73] Restarting data prefetching from start.
I0523 16:03:34.958622 2712679360 solver.cpp:330] Iteration 4500, Testing net (#0)
I0523 16:03:45.910739 215015424 data_layer.cpp:73] Restarting data prefetching from start.
I0523 16:03:46.349741 2712679360 solver.cpp:397]     Test net output #0: accuracy = 0.752
I0523 16:03:46.349779 2712679360 solver.cpp:397]     Test net output #1: loss = 0.748076 (* 1 = 0.748076 loss)
I0523 16:03:46.589071 2712679360 solver.cpp:218] Iteration 4500 (2.79744 iter/s, 35.747s/100 iters), loss = 0.503921
I0523 16:03:46.589107 2712679360 solver.cpp:237]     Train net output #0: loss = 0.503921 (* 1 = 0.503921 loss)
I0523 16:03:46.589113 2712679360 sgd_solver.cpp:105] Iteration 4500, lr = 0.0001
I0523 16:04:10.851019 2712679360 solver.cpp:218] Iteration 4600 (4.12184 iter/s, 24.261s/100 iters), loss = 0.562534
I0523 16:04:10.851088 2712679360 solver.cpp:237]     Train net output #0: loss = 0.562534 (* 1 = 0.562534 loss)
I0523 16:04:10.851095 2712679360 sgd_solver.cpp:105] Iteration 4600, lr = 0.0001
I0523 16:04:35.547813 2712679360 solver.cpp:218] Iteration 4700 (4.04924 iter/s, 24.696s/100 iters), loss = 0.464102
I0523 16:04:35.547852 2712679360 solver.cpp:237]     Train net output #0: loss = 0.464102 (* 1 = 0.464102 loss)
I0523 16:04:35.547860 2712679360 sgd_solver.cpp:105] Iteration 4700, lr = 0.0001
I0523 16:05:00.517423 2712679360 solver.cpp:218] Iteration 4800 (4.00497 iter/s, 24.969s/100 iters), loss = 0.474584
I0523 16:05:00.517478 2712679360 solver.cpp:237]     Train net output #0: loss = 0.474584 (* 1 = 0.474584 loss)
I0523 16:05:00.517487 2712679360 sgd_solver.cpp:105] Iteration 4800, lr = 0.0001
I0523 16:05:24.429520 2712679360 solver.cpp:218] Iteration 4900 (4.182 iter/s, 23.912s/100 iters), loss = 0.417258
I0523 16:05:24.429554 2712679360 solver.cpp:237]     Train net output #0: loss = 0.417258 (* 1 = 0.417258 loss)
I0523 16:05:24.429563 2712679360 sgd_solver.cpp:105] Iteration 4900, lr = 0.0001
I0523 16:05:47.148733 214478848 data_layer.cpp:73] Restarting data prefetching from start.
I0523 16:05:48.086921 2712679360 solver.cpp:457] Snapshotting to HDF5 file examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5
I0523 16:05:48.101351 2712679360 sgd_solver.cpp:283] Snapshotting solver state to HDF5 file examples/cifar10/cifar10_quick_iter_5000.solverstate.h5
I0523 16:05:48.215885 2712679360 solver.cpp:310] Iteration 5000, loss = 0.487594
I0523 16:05:48.215921 2712679360 solver.cpp:330] Iteration 5000, Testing net (#0)
I0523 16:05:58.710295 215015424 data_layer.cpp:73] Restarting data prefetching from start.
I0523 16:05:59.149840 2712679360 solver.cpp:397]     Test net output #0: accuracy = 0.754
I0523 16:05:59.149875 2712679360 solver.cpp:397]     Test net output #1: loss = 0.742307 (* 1 = 0.742307 loss)
I0523 16:05:59.149883 2712679360 solver.cpp:315] Optimization Done.
I0523 16:05:59.149888 2712679360 caffe.cpp:259] Optimization Done.

Training is complete, and the test network has been set up at the end. Note that the final weights and solver state were snapshotted in HDF5 format to examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 and examples/cifar10/cifar10_quick_iter_5000.solverstate.h5.
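
Since the snapshot is plain HDF5, it can be inspected without caffe at all. A minimal sketch, assuming h5py is installed and the path matches the solver's snapshot_prefix:

import h5py

# Walk the HDF5 snapshot the solver just wrote and print every
# group/dataset name (the layer parameters live under these paths).
with h5py.File('examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5', 'r') as f:
    f.visit(print)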

Next, we use the trained cifar10 model to run predictions on the test data.

Run the following command:

➜  caffe git:(master) ✗ ./build/tools/caffe.bin test \
-model examples/cifar10/cifar10_quick_train_test.prototxt \
-weights examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 \
-iterations 100

This runs prediction on the test dataset: -model points to the net definition, -weights to the HDF5 snapshot we just trained, and -iterations 100 evaluates 100 batches of 100 images each, i.e. the full 10,000-image CIFAR-10 test set. The output:

I0523 16:25:41.234220 2712679360 caffe.cpp:284] Use CPU.
I0523 16:25:41.238044 2712679360 net.cpp:294] The NetState phase (1) differed from the phase (0) specified by a rule in layer cifar
I0523 16:25:41.238080 2712679360 net.cpp:51] Initializing net from parameters:
name: "CIFAR10_quick"
state {
  phase: TEST
  level: 0
  stage: ""
}
layer {
  name: "cifar"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TEST
  }
  transform_param {
    mean_file: "examples/cifar10/mean.binaryproto"
  }
  data_param {
    source: "examples/cifar10/cifar10_test_lmdb"
    batch_size: 100
    backend: LMDB
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.0001
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "pool1"
  top: "pool1"
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 64
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "pool3"
  type: "Pooling"
  bottom: "conv3"
  top: "pool3"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "pool3"
  top: "ip1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 64
    weight_filler {
      type: "gaussian"
      std: 0.1
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "ip2"
  type: "InnerProduct"
  bottom: "ip1"
  top: "ip2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 10
    weight_filler {
      type: "gaussian"
      std: 0.1
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "ip2"
  bottom: "label"
  top: "accuracy"
  include {
    phase: TEST
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "ip2"
  bottom: "label"
  top: "loss"
}
I0523 16:25:41.238523 2712679360 layer_factory.hpp:77] Creating layer cifar
I0523 16:25:41.238731 2712679360 db_lmdb.cpp:35] Opened lmdb examples/cifar10/cifar10_test_lmdb
I0523 16:25:41.238788 2712679360 net.cpp:84] Creating Layer cifar
I0523 16:25:41.238796 2712679360 net.cpp:380] cifar -> data
I0523 16:25:41.238816 2712679360 net.cpp:380] cifar -> label
I0523 16:25:41.238834 2712679360 data_transformer.cpp:25] Loading mean file from: examples/cifar10/mean.binaryproto
I0523 16:25:41.238957 2712679360 data_layer.cpp:45] output data size: 100,3,32,32
I0523 16:25:41.246219 2712679360 net.cpp:122] Setting up cifar
I0523 16:25:41.246245 2712679360 net.cpp:129] Top shape: 100 3 32 32 (307200)
I0523 16:25:41.246253 2712679360 net.cpp:129] Top shape: 100 (100)
I0523 16:25:41.246258 2712679360 net.cpp:137] Memory required for data: 1229200
I0523 16:25:41.246266 2712679360 layer_factory.hpp:77] Creating layer label_cifar_1_split
I0523 16:25:41.246278 2712679360 net.cpp:84] Creating Layer label_cifar_1_split
I0523 16:25:41.246282 2712679360 net.cpp:406] label_cifar_1_split <- label
I0523 16:25:41.246343 2712679360 net.cpp:380] label_cifar_1_split -> label_cifar_1_split_0
I0523 16:25:41.246367 2712679360 net.cpp:380] label_cifar_1_split -> label_cifar_1_split_1
I0523 16:25:41.246381 2712679360 net.cpp:122] Setting up label_cifar_1_split
I0523 16:25:41.246390 2712679360 net.cpp:129] Top shape: 100 (100)
I0523 16:25:41.246400 2712679360 net.cpp:129] Top shape: 100 (100)
I0523 16:25:41.246409 2712679360 net.cpp:137] Memory required for data: 1230000
I0523 16:25:41.246417 2712679360 layer_factory.hpp:77] Creating layer conv1
I0523 16:25:41.246438 2712679360 net.cpp:84] Creating Layer conv1
I0523 16:25:41.246448 2712679360 net.cpp:406] conv1 <- data
I0523 16:25:41.246457 2712679360 net.cpp:380] conv1 -> conv1
I0523 16:25:41.246606 2712679360 net.cpp:122] Setting up conv1
I0523 16:25:41.246637 2712679360 net.cpp:129] Top shape: 100 32 32 32 (3276800)
I0523 16:25:41.246680 2712679360 net.cpp:137] Memory required for data: 14337200
I0523 16:25:41.246693 2712679360 layer_factory.hpp:77] Creating layer pool1
I0523 16:25:41.246708 2712679360 net.cpp:84] Creating Layer pool1
I0523 16:25:41.246721 2712679360 net.cpp:406] pool1 <- conv1
I0523 16:25:41.246731 2712679360 net.cpp:380] pool1 -> pool1
I0523 16:25:41.246752 2712679360 net.cpp:122] Setting up pool1
I0523 16:25:41.246781 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 16:25:41.246788 2712679360 net.cpp:137] Memory required for data: 17614000
I0523 16:25:41.246793 2712679360 layer_factory.hpp:77] Creating layer relu1
I0523 16:25:41.246804 2712679360 net.cpp:84] Creating Layer relu1
I0523 16:25:41.246809 2712679360 net.cpp:406] relu1 <- pool1
I0523 16:25:41.246814 2712679360 net.cpp:367] relu1 -> pool1 (in-place)
I0523 16:25:41.246821 2712679360 net.cpp:122] Setting up relu1
I0523 16:25:41.246825 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 16:25:41.246830 2712679360 net.cpp:137] Memory required for data: 20890800
I0523 16:25:41.246834 2712679360 layer_factory.hpp:77] Creating layer conv2
I0523 16:25:41.246841 2712679360 net.cpp:84] Creating Layer conv2
I0523 16:25:41.246846 2712679360 net.cpp:406] conv2 <- pool1
I0523 16:25:41.246851 2712679360 net.cpp:380] conv2 -> conv2
I0523 16:25:41.247228 2712679360 net.cpp:122] Setting up conv2
I0523 16:25:41.247236 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 16:25:41.247242 2712679360 net.cpp:137] Memory required for data: 24167600
I0523 16:25:41.247249 2712679360 layer_factory.hpp:77] Creating layer relu2
I0523 16:25:41.247259 2712679360 net.cpp:84] Creating Layer relu2
I0523 16:25:41.247264 2712679360 net.cpp:406] relu2 <- conv2
I0523 16:25:41.247269 2712679360 net.cpp:367] relu2 -> conv2 (in-place)
I0523 16:25:41.247274 2712679360 net.cpp:122] Setting up relu2
I0523 16:25:41.247278 2712679360 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0523 16:25:41.247283 2712679360 net.cpp:137] Memory required for data: 27444400
I0523 16:25:41.247287 2712679360 layer_factory.hpp:77] Creating layer pool2
I0523 16:25:41.247293 2712679360 net.cpp:84] Creating Layer pool2
I0523 16:25:41.247298 2712679360 net.cpp:406] pool2 <- conv2
I0523 16:25:41.247301 2712679360 net.cpp:380] pool2 -> pool2
I0523 16:25:41.247308 2712679360 net.cpp:122] Setting up pool2
I0523 16:25:41.247313 2712679360 net.cpp:129] Top shape: 100 32 8 8 (204800)
I0523 16:25:41.247318 2712679360 net.cpp:137] Memory required for data: 28263600
I0523 16:25:41.247321 2712679360 layer_factory.hpp:77] Creating layer conv3
I0523 16:25:41.247329 2712679360 net.cpp:84] Creating Layer conv3
I0523 16:25:41.247334 2712679360 net.cpp:406] conv3 <- pool2
I0523 16:25:41.247339 2712679360 net.cpp:380] conv3 -> conv3
I0523 16:25:41.248001 2712679360 net.cpp:122] Setting up conv3
I0523 16:25:41.248008 2712679360 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0523 16:25:41.248013 2712679360 net.cpp:137] Memory required for data: 29902000
I0523 16:25:41.248020 2712679360 layer_factory.hpp:77] Creating layer relu3
I0523 16:25:41.248025 2712679360 net.cpp:84] Creating Layer relu3
I0523 16:25:41.248051 2712679360 net.cpp:406] relu3 <- conv3
I0523 16:25:41.248057 2712679360 net.cpp:367] relu3 -> conv3 (in-place)
I0523 16:25:41.248067 2712679360 net.cpp:122] Setting up relu3
I0523 16:25:41.248072 2712679360 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0523 16:25:41.248077 2712679360 net.cpp:137] Memory required for data: 31540400
I0523 16:25:41.248081 2712679360 layer_factory.hpp:77] Creating layer pool3
I0523 16:25:41.248085 2712679360 net.cpp:84] Creating Layer pool3
I0523 16:25:41.248090 2712679360 net.cpp:406] pool3 <- conv3
I0523 16:25:41.248095 2712679360 net.cpp:380] pool3 -> pool3
I0523 16:25:41.248102 2712679360 net.cpp:122] Setting up pool3
I0523 16:25:41.248109 2712679360 net.cpp:129] Top shape: 100 64 4 4 (102400)
I0523 16:25:41.248114 2712679360 net.cpp:137] Memory required for data: 31950000
I0523 16:25:41.248117 2712679360 layer_factory.hpp:77] Creating layer ip1
I0523 16:25:41.248124 2712679360 net.cpp:84] Creating Layer ip1
I0523 16:25:41.248152 2712679360 net.cpp:406] ip1 <- pool3
I0523 16:25:41.248162 2712679360 net.cpp:380] ip1 -> ip1
I0523 16:25:41.248950 2712679360 net.cpp:122] Setting up ip1
I0523 16:25:41.248993 2712679360 net.cpp:129] Top shape: 100 64 (6400)
I0523 16:25:41.249008 2712679360 net.cpp:137] Memory required for data: 31975600
I0523 16:25:41.249014 2712679360 layer_factory.hpp:77] Creating layer ip2
I0523 16:25:41.249020 2712679360 net.cpp:84] Creating Layer ip2
I0523 16:25:41.249024 2712679360 net.cpp:406] ip2 <- ip1
I0523 16:25:41.249038 2712679360 net.cpp:380] ip2 -> ip2
I0523 16:25:41.249080 2712679360 net.cpp:122] Setting up ip2
I0523 16:25:41.249097 2712679360 net.cpp:129] Top shape: 100 10 (1000)
I0523 16:25:41.249102 2712679360 net.cpp:137] Memory required for data: 31979600
I0523 16:25:41.249115 2712679360 layer_factory.hpp:77] Creating layer ip2_ip2_0_split
I0523 16:25:41.249120 2712679360 net.cpp:84] Creating Layer ip2_ip2_0_split
I0523 16:25:41.249125 2712679360 net.cpp:406] ip2_ip2_0_split <- ip2
I0523 16:25:41.249130 2712679360 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_0
I0523 16:25:41.249143 2712679360 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_1
I0523 16:25:41.249150 2712679360 net.cpp:122] Setting up ip2_ip2_0_split
I0523 16:25:41.249155 2712679360 net.cpp:129] Top shape: 100 10 (1000)
I0523 16:25:41.249164 2712679360 net.cpp:129] Top shape: 100 10 (1000)
I0523 16:25:41.249171 2712679360 net.cpp:137] Memory required for data: 31987600
I0523 16:25:41.249174 2712679360 layer_factory.hpp:77] Creating layer accuracy
I0523 16:25:41.249183 2712679360 net.cpp:84] Creating Layer accuracy
I0523 16:25:41.249187 2712679360 net.cpp:406] accuracy <- ip2_ip2_0_split_0
I0523 16:25:41.249191 2712679360 net.cpp:406] accuracy <- label_cifar_1_split_0
I0523 16:25:41.249195 2712679360 net.cpp:380] accuracy -> accuracy
I0523 16:25:41.249202 2712679360 net.cpp:122] Setting up accuracy
I0523 16:25:41.249205 2712679360 net.cpp:129] Top shape: (1)
I0523 16:25:41.249209 2712679360 net.cpp:137] Memory required for data: 31987604
I0523 16:25:41.249214 2712679360 layer_factory.hpp:77] Creating layer loss
I0523 16:25:41.249219 2712679360 net.cpp:84] Creating Layer loss
I0523 16:25:41.249223 2712679360 net.cpp:406] loss <- ip2_ip2_0_split_1
I0523 16:25:41.249236 2712679360 net.cpp:406] loss <- label_cifar_1_split_1
I0523 16:25:41.249241 2712679360 net.cpp:380] loss -> loss
I0523 16:25:41.249249 2712679360 layer_factory.hpp:77] Creating layer loss
I0523 16:25:41.249266 2712679360 net.cpp:122] Setting up loss
I0523 16:25:41.249274 2712679360 net.cpp:129] Top shape: (1)
I0523 16:25:41.249279 2712679360 net.cpp:132]     with loss weight 1
I0523 16:25:41.249300 2712679360 net.cpp:137] Memory required for data: 31987608
I0523 16:25:41.249305 2712679360 net.cpp:198] loss needs backward computation.
I0523 16:25:41.249310 2712679360 net.cpp:200] accuracy does not need backward computation.
I0523 16:25:41.249320 2712679360 net.cpp:198] ip2_ip2_0_split needs backward computation.
I0523 16:25:41.249325 2712679360 net.cpp:198] ip2 needs backward computation.
I0523 16:25:41.249330 2712679360 net.cpp:198] ip1 needs backward computation.
I0523 16:25:41.249366 2712679360 net.cpp:198] pool3 needs backward computation.
I0523 16:25:41.249388 2712679360 net.cpp:198] relu3 needs backward computation.
I0523 16:25:41.249392 2712679360 net.cpp:198] conv3 needs backward computation.
I0523 16:25:41.249408 2712679360 net.cpp:198] pool2 needs backward computation.
I0523 16:25:41.249413 2712679360 net.cpp:198] relu2 needs backward computation.
I0523 16:25:41.249416 2712679360 net.cpp:198] conv2 needs backward computation.
I0523 16:25:41.249420 2712679360 net.cpp:198] relu1 needs backward computation.
I0523 16:25:41.249424 2712679360 net.cpp:198] pool1 needs backward computation.
I0523 16:25:41.249428 2712679360 net.cpp:198] conv1 needs backward computation.
I0523 16:25:41.249431 2712679360 net.cpp:200] label_cifar_1_split does not need backward computation.
I0523 16:25:41.249436 2712679360 net.cpp:200] cifar does not need backward computation.
I0523 16:25:41.249439 2712679360 net.cpp:242] This network produces output accuracy
I0523 16:25:41.249444 2712679360 net.cpp:242] This network produces output loss
I0523 16:25:41.249451 2712679360 net.cpp:255] Network initialization done.
I0523 16:25:41.251152 2712679360 hdf5.cpp:32] Datatype class: H5T_FLOAT
I0523 16:25:41.252013 2712679360 caffe.cpp:290] Running for 100 iterations.
I0523 16:25:41.367466 2712679360 caffe.cpp:313] Batch 0, accuracy = 0.81
I0523 16:25:41.367501 2712679360 caffe.cpp:313] Batch 0, loss = 0.650321
I0523 16:25:41.465518 2712679360 caffe.cpp:313] Batch 1, accuracy = 0.75
I0523 16:25:41.465550 2712679360 caffe.cpp:313] Batch 1, loss = 0.767328
I0523 16:25:41.560680 2712679360 caffe.cpp:313] Batch 2, accuracy = 0.71
I0523 16:25:41.560712 2712679360 caffe.cpp:313] Batch 2, loss = 0.810281
I0523 16:25:41.656878 2712679360 caffe.cpp:313] Batch 3, accuracy = 0.7
I0523 16:25:41.656913 2712679360 caffe.cpp:313] Batch 3, loss = 0.807916
I0523 16:25:41.757275 2712679360 caffe.cpp:313] Batch 4, accuracy = 0.71
I0523 16:25:41.757313 2712679360 caffe.cpp:313] Batch 4, loss = 0.797028
I0523 16:25:41.855583 2712679360 caffe.cpp:313] Batch 5, accuracy = 0.84
I0523 16:25:41.855613 2712679360 caffe.cpp:313] Batch 5, loss = 0.422262
I0523 16:25:41.953912 2712679360 caffe.cpp:313] Batch 6, accuracy = 0.73
I0523 16:25:41.953946 2712679360 caffe.cpp:313] Batch 6, loss = 0.696204
I0523 16:25:42.052671 2712679360 caffe.cpp:313] Batch 7, accuracy = 0.72
I0523 16:25:42.052705 2712679360 caffe.cpp:313] Batch 7, loss = 0.896313
I0523 16:25:42.155107 2712679360 caffe.cpp:313] Batch 8, accuracy = 0.73
I0523 16:25:42.155153 2712679360 caffe.cpp:313] Batch 8, loss = 0.862504
I0523 16:25:42.258592 2712679360 caffe.cpp:313] Batch 9, accuracy = 0.78
I0523 16:25:42.258627 2712679360 caffe.cpp:313] Batch 9, loss = 0.642714
I0523 16:25:42.362510 2712679360 caffe.cpp:313] Batch 10, accuracy = 0.75
I0523 16:25:42.362543 2712679360 caffe.cpp:313] Batch 10, loss = 0.827924
I0523 16:25:42.463922 2712679360 caffe.cpp:313] Batch 11, accuracy = 0.76
I0523 16:25:42.463953 2712679360 caffe.cpp:313] Batch 11, loss = 0.674977
I0523 16:25:42.567791 2712679360 caffe.cpp:313] Batch 12, accuracy = 0.7
I0523 16:25:42.567822 2712679360 caffe.cpp:313] Batch 12, loss = 0.717463
I0523 16:25:42.664435 2712679360 caffe.cpp:313] Batch 13, accuracy = 0.75
I0523 16:25:42.664469 2712679360 caffe.cpp:313] Batch 13, loss = 0.640668
I0523 16:25:42.759980 2712679360 caffe.cpp:313] Batch 14, accuracy = 0.78
I0523 16:25:42.760013 2712679360 caffe.cpp:313] Batch 14, loss = 0.62553
I0523 16:25:42.856386 2712679360 caffe.cpp:313] Batch 15, accuracy = 0.76
I0523 16:25:42.856417 2712679360 caffe.cpp:313] Batch 15, loss = 0.721462
I0523 16:25:42.954746 2712679360 caffe.cpp:313] Batch 16, accuracy = 0.73
I0523 16:25:42.954777 2712679360 caffe.cpp:313] Batch 16, loss = 0.858499
I0523 16:25:43.053562 2712679360 caffe.cpp:313] Batch 17, accuracy = 0.75
I0523 16:25:43.053593 2712679360 caffe.cpp:313] Batch 17, loss = 0.746772
I0523 16:25:43.155479 2712679360 caffe.cpp:313] Batch 18, accuracy = 0.74
I0523 16:25:43.155508 2712679360 caffe.cpp:313] Batch 18, loss = 0.893995
I0523 16:25:43.254688 2712679360 caffe.cpp:313] Batch 19, accuracy = 0.68
I0523 16:25:43.254716 2712679360 caffe.cpp:313] Batch 19, loss = 0.943102
I0523 16:25:43.364045 2712679360 caffe.cpp:313] Batch 20, accuracy = 0.7
I0523 16:25:43.364076 2712679360 caffe.cpp:313] Batch 20, loss = 0.786499
I0523 16:25:43.465351 2712679360 caffe.cpp:313] Batch 21, accuracy = 0.76
I0523 16:25:43.465384 2712679360 caffe.cpp:313] Batch 21, loss = 0.742349
I0523 16:25:43.560330 2712679360 caffe.cpp:313] Batch 22, accuracy = 0.8
I0523 16:25:43.560362 2712679360 caffe.cpp:313] Batch 22, loss = 0.707087
I0523 16:25:43.662050 2712679360 caffe.cpp:313] Batch 23, accuracy = 0.69
I0523 16:25:43.662077 2712679360 caffe.cpp:313] Batch 23, loss = 0.854361
I0523 16:25:43.760444 2712679360 caffe.cpp:313] Batch 24, accuracy = 0.74
I0523 16:25:43.760473 2712679360 caffe.cpp:313] Batch 24, loss = 0.844035
I0523 16:25:43.858397 2712679360 caffe.cpp:313] Batch 25, accuracy = 0.68
I0523 16:25:43.858425 2712679360 caffe.cpp:313] Batch 25, loss = 1.02302
I0523 16:25:43.959595 2712679360 caffe.cpp:313] Batch 26, accuracy = 0.82
I0523 16:25:43.959627 2712679360 caffe.cpp:313] Batch 26, loss = 0.493385
I0523 16:25:44.057914 2712679360 caffe.cpp:313] Batch 27, accuracy = 0.76
I0523 16:25:44.057942 2712679360 caffe.cpp:313] Batch 27, loss = 0.78877
I0523 16:25:44.157359 2712679360 caffe.cpp:313] Batch 28, accuracy = 0.78
I0523 16:25:44.157388 2712679360 caffe.cpp:313] Batch 28, loss = 0.709657
I0523 16:25:44.285976 2712679360 caffe.cpp:313] Batch 29, accuracy = 0.78
I0523 16:25:44.286007 2712679360 caffe.cpp:313] Batch 29, loss = 0.674438
I0523 16:25:44.390980 2712679360 caffe.cpp:313] Batch 30, accuracy = 0.79
I0523 16:25:44.391010 2712679360 caffe.cpp:313] Batch 30, loss = 0.65947
I0523 16:25:44.491211 2712679360 caffe.cpp:313] Batch 31, accuracy = 0.77
I0523 16:25:44.491241 2712679360 caffe.cpp:313] Batch 31, loss = 0.716022
I0523 16:25:44.593423 2712679360 caffe.cpp:313] Batch 32, accuracy = 0.73
I0523 16:25:44.593457 2712679360 caffe.cpp:313] Batch 32, loss = 0.805526
I0523 16:25:44.692994 2712679360 caffe.cpp:313] Batch 33, accuracy = 0.68
I0523 16:25:44.693023 2712679360 caffe.cpp:313] Batch 33, loss = 0.903316
I0523 16:25:44.795087 2712679360 caffe.cpp:313] Batch 34, accuracy = 0.72
I0523 16:25:44.795116 2712679360 caffe.cpp:313] Batch 34, loss = 0.834438
I0523 16:25:44.897828 2712679360 caffe.cpp:313] Batch 35, accuracy = 0.73
I0523 16:25:44.897874 2712679360 caffe.cpp:313] Batch 35, loss = 0.908751
I0523 16:25:44.996119 2712679360 caffe.cpp:313] Batch 36, accuracy = 0.74
I0523 16:25:44.996150 2712679360 caffe.cpp:313] Batch 36, loss = 0.981981
I0523 16:25:45.093991 2712679360 caffe.cpp:313] Batch 37, accuracy = 0.76
I0523 16:25:45.094023 2712679360 caffe.cpp:313] Batch 37, loss = 0.725703
I0523 16:25:45.195551 2712679360 caffe.cpp:313] Batch 38, accuracy = 0.78
I0523 16:25:45.195585 2712679360 caffe.cpp:313] Batch 38, loss = 0.686703
I0523 16:25:45.292881 2712679360 caffe.cpp:313] Batch 39, accuracy = 0.8
I0523 16:25:45.292912 2712679360 caffe.cpp:313] Batch 39, loss = 0.650689
I0523 16:25:45.397084 2712679360 caffe.cpp:313] Batch 40, accuracy = 0.79
I0523 16:25:45.397115 2712679360 caffe.cpp:313] Batch 40, loss = 0.755663
I0523 16:25:45.495128 2712679360 caffe.cpp:313] Batch 41, accuracy = 0.82
I0523 16:25:45.495160 2712679360 caffe.cpp:313] Batch 41, loss = 0.855221
I0523 16:25:45.597597 2712679360 caffe.cpp:313] Batch 42, accuracy = 0.81
I0523 16:25:45.597626 2712679360 caffe.cpp:313] Batch 42, loss = 0.552907
I0523 16:25:45.695441 2712679360 caffe.cpp:313] Batch 43, accuracy = 0.8
I0523 16:25:45.695472 2712679360 caffe.cpp:313] Batch 43, loss = 0.688889
I0523 16:25:45.796842 2712679360 caffe.cpp:313] Batch 44, accuracy = 0.8
I0523 16:25:45.796875 2712679360 caffe.cpp:313] Batch 44, loss = 0.713613
I0523 16:25:45.899427 2712679360 caffe.cpp:313] Batch 45, accuracy = 0.76
I0523 16:25:45.899462 2712679360 caffe.cpp:313] Batch 45, loss = 0.819739
I0523 16:25:46.003129 2712679360 caffe.cpp:313] Batch 46, accuracy = 0.77
I0523 16:25:46.003190 2712679360 caffe.cpp:313] Batch 46, loss = 0.79499
I0523 16:25:46.101080 2712679360 caffe.cpp:313] Batch 47, accuracy = 0.73
I0523 16:25:46.101112 2712679360 caffe.cpp:313] Batch 47, loss = 0.784097
I0523 16:25:46.199532 2712679360 caffe.cpp:313] Batch 48, accuracy = 0.82
I0523 16:25:46.199563 2712679360 caffe.cpp:313] Batch 48, loss = 0.509592
I0523 16:25:46.296840 2712679360 caffe.cpp:313] Batch 49, accuracy = 0.76
I0523 16:25:46.296872 2712679360 caffe.cpp:313] Batch 49, loss = 0.775396
I0523 16:25:46.399880 2712679360 caffe.cpp:313] Batch 50, accuracy = 0.77
I0523 16:25:46.399914 2712679360 caffe.cpp:313] Batch 50, loss = 0.61452
I0523 16:25:46.500458 2712679360 caffe.cpp:313] Batch 51, accuracy = 0.79
I0523 16:25:46.500488 2712679360 caffe.cpp:313] Batch 51, loss = 0.631971
I0523 16:25:46.599107 2712679360 caffe.cpp:313] Batch 52, accuracy = 0.78
I0523 16:25:46.599139 2712679360 caffe.cpp:313] Batch 52, loss = 0.613152
I0523 16:25:46.699442 2712679360 caffe.cpp:313] Batch 53, accuracy = 0.74
I0523 16:25:46.699475 2712679360 caffe.cpp:313] Batch 53, loss = 0.813763
I0523 16:25:46.802717 2712679360 caffe.cpp:313] Batch 54, accuracy = 0.69
I0523 16:25:46.802749 2712679360 caffe.cpp:313] Batch 54, loss = 0.79753
I0523 16:25:46.903400 2712679360 caffe.cpp:313] Batch 55, accuracy = 0.81
I0523 16:25:46.903430 2712679360 caffe.cpp:313] Batch 55, loss = 0.683275
I0523 16:25:47.007345 2712679360 caffe.cpp:313] Batch 56, accuracy = 0.78
I0523 16:25:47.007377 2712679360 caffe.cpp:313] Batch 56, loss = 0.785579
I0523 16:25:47.107044 2712679360 caffe.cpp:313] Batch 57, accuracy = 0.84
I0523 16:25:47.107076 2712679360 caffe.cpp:313] Batch 57, loss = 0.455638
I0523 16:25:47.204998 2712679360 caffe.cpp:313] Batch 58, accuracy = 0.7
I0523 16:25:47.205029 2712679360 caffe.cpp:313] Batch 58, loss = 0.685973
I0523 16:25:47.307816 2712679360 caffe.cpp:313] Batch 59, accuracy = 0.74
I0523 16:25:47.307848 2712679360 caffe.cpp:313] Batch 59, loss = 0.815847
I0523 16:25:47.409512 2712679360 caffe.cpp:313] Batch 60, accuracy = 0.79
I0523 16:25:47.409544 2712679360 caffe.cpp:313] Batch 60, loss = 0.694609
I0523 16:25:47.509786 2712679360 caffe.cpp:313] Batch 61, accuracy = 0.72
I0523 16:25:47.509819 2712679360 caffe.cpp:313] Batch 61, loss = 0.721049
I0523 16:25:47.608265 2712679360 caffe.cpp:313] Batch 62, accuracy = 0.76
I0523 16:25:47.608304 2712679360 caffe.cpp:313] Batch 62, loss = 0.649006
I0523 16:25:47.711271 2712679360 caffe.cpp:313] Batch 63, accuracy = 0.77
I0523 16:25:47.711302 2712679360 caffe.cpp:313] Batch 63, loss = 0.620039
I0523 16:25:47.812440 2712679360 caffe.cpp:313] Batch 64, accuracy = 0.71
I0523 16:25:47.812471 2712679360 caffe.cpp:313] Batch 64, loss = 0.706689
I0523 16:25:47.911661 2712679360 caffe.cpp:313] Batch 65, accuracy = 0.77
I0523 16:25:47.911694 2712679360 caffe.cpp:313] Batch 65, loss = 0.824431
I0523 16:25:48.011318 2712679360 caffe.cpp:313] Batch 66, accuracy = 0.73
I0523 16:25:48.011351 2712679360 caffe.cpp:313] Batch 66, loss = 0.739382
I0523 16:25:48.117573 2712679360 caffe.cpp:313] Batch 67, accuracy = 0.7
I0523 16:25:48.117606 2712679360 caffe.cpp:313] Batch 67, loss = 0.800725
I0523 16:25:48.214515 2712679360 caffe.cpp:313] Batch 68, accuracy = 0.68
I0523 16:25:48.214545 2712679360 caffe.cpp:313] Batch 68, loss = 0.807705
I0523 16:25:48.314254 2712679360 caffe.cpp:313] Batch 69, accuracy = 0.7
I0523 16:25:48.314283 2712679360 caffe.cpp:313] Batch 69, loss = 0.952385
I0523 16:25:48.412657 2712679360 caffe.cpp:313] Batch 70, accuracy = 0.74
I0523 16:25:48.412686 2712679360 caffe.cpp:313] Batch 70, loss = 0.781932
I0523 16:25:48.512931 2712679360 caffe.cpp:313] Batch 71, accuracy = 0.73
I0523 16:25:48.512964 2712679360 caffe.cpp:313] Batch 71, loss = 0.895561
I0523 16:25:48.608669 2712679360 caffe.cpp:313] Batch 72, accuracy = 0.8
I0523 16:25:48.608700 2712679360 caffe.cpp:313] Batch 72, loss = 0.615967
I0523 16:25:48.705847 2712679360 caffe.cpp:313] Batch 73, accuracy = 0.78
I0523 16:25:48.705878 2712679360 caffe.cpp:313] Batch 73, loss = 0.588951
I0523 16:25:48.803540 2712679360 caffe.cpp:313] Batch 74, accuracy = 0.72
I0523 16:25:48.803591 2712679360 caffe.cpp:313] Batch 74, loss = 0.784208
I0523 16:25:48.906528 2712679360 caffe.cpp:313] Batch 75, accuracy = 0.77
I0523 16:25:48.906565 2712679360 caffe.cpp:313] Batch 75, loss = 0.529825
I0523 16:25:49.007186 2712679360 caffe.cpp:313] Batch 76, accuracy = 0.77
I0523 16:25:49.007216 2712679360 caffe.cpp:313] Batch 76, loss = 0.794115
I0523 16:25:49.107000 2712679360 caffe.cpp:313] Batch 77, accuracy = 0.76
I0523 16:25:49.107033 2712679360 caffe.cpp:313] Batch 77, loss = 0.726804
I0523 16:25:49.205263 2712679360 caffe.cpp:313] Batch 78, accuracy = 0.77
I0523 16:25:49.205294 2712679360 caffe.cpp:313] Batch 78, loss = 0.919712
I0523 16:25:49.304277 2712679360 caffe.cpp:313] Batch 79, accuracy = 0.69
I0523 16:25:49.304309 2712679360 caffe.cpp:313] Batch 79, loss = 0.87618
I0523 16:25:49.404642 2712679360 caffe.cpp:313] Batch 80, accuracy = 0.77
I0523 16:25:49.404672 2712679360 caffe.cpp:313] Batch 80, loss = 0.704637
I0523 16:25:49.501708 2712679360 caffe.cpp:313] Batch 81, accuracy = 0.75
I0523 16:25:49.501739 2712679360 caffe.cpp:313] Batch 81, loss = 0.71787
I0523 16:25:49.599267 2712679360 caffe.cpp:313] Batch 82, accuracy = 0.76
I0523 16:25:49.599304 2712679360 caffe.cpp:313] Batch 82, loss = 0.613339
I0523 16:25:49.698971 2712679360 caffe.cpp:313] Batch 83, accuracy = 0.78
I0523 16:25:49.699002 2712679360 caffe.cpp:313] Batch 83, loss = 0.689216
I0523 16:25:49.803320 2712679360 caffe.cpp:313] Batch 84, accuracy = 0.72
I0523 16:25:49.803352 2712679360 caffe.cpp:313] Batch 84, loss = 0.817351
I0523 16:25:49.904433 2712679360 caffe.cpp:313] Batch 85, accuracy = 0.78
I0523 16:25:49.904467 2712679360 caffe.cpp:313] Batch 85, loss = 0.62069
I0523 16:25:50.005846 2712679360 caffe.cpp:313] Batch 86, accuracy = 0.75
I0523 16:25:50.005878 2712679360 caffe.cpp:313] Batch 86, loss = 0.680651
I0523 16:25:50.103121 2712679360 caffe.cpp:313] Batch 87, accuracy = 0.78
I0523 16:25:50.103153 2712679360 caffe.cpp:313] Batch 87, loss = 0.788875
I0523 16:25:50.200103 2712679360 caffe.cpp:313] Batch 88, accuracy = 0.8
I0523 16:25:50.200134 2712679360 caffe.cpp:313] Batch 88, loss = 0.620548
I0523 16:25:50.299957 2712679360 caffe.cpp:313] Batch 89, accuracy = 0.74
I0523 16:25:50.299989 2712679360 caffe.cpp:313] Batch 89, loss = 0.779962
I0523 16:25:50.399699 2712679360 caffe.cpp:313] Batch 90, accuracy = 0.75
I0523 16:25:50.399731 2712679360 caffe.cpp:313] Batch 90, loss = 0.70084
I0523 16:25:50.502117 2712679360 caffe.cpp:313] Batch 91, accuracy = 0.79
I0523 16:25:50.502148 2712679360 caffe.cpp:313] Batch 91, loss = 0.576651
I0523 16:25:50.599150 2712679360 caffe.cpp:313] Batch 92, accuracy = 0.71
I0523 16:25:50.599181 2712679360 caffe.cpp:313] Batch 92, loss = 0.9778
I0523 16:25:50.699782 2712679360 caffe.cpp:313] Batch 93, accuracy = 0.78
I0523 16:25:50.699813 2712679360 caffe.cpp:313] Batch 93, loss = 0.795732
I0523 16:25:50.802847 2712679360 caffe.cpp:313] Batch 94, accuracy = 0.77
I0523 16:25:50.802877 2712679360 caffe.cpp:313] Batch 94, loss = 0.803904
I0523 16:25:50.900668 2712679360 caffe.cpp:313] Batch 95, accuracy = 0.77
I0523 16:25:50.900702 2712679360 caffe.cpp:313] Batch 95, loss = 0.664654
I0523 16:25:50.902439 102174720 data_layer.cpp:73] Restarting data prefetching from start.
I0523 16:25:50.999625 2712679360 caffe.cpp:313] Batch 96, accuracy = 0.74
I0523 16:25:50.999656 2712679360 caffe.cpp:313] Batch 96, loss = 0.700099
I0523 16:25:51.100697 2712679360 caffe.cpp:313] Batch 97, accuracy = 0.66
I0523 16:25:51.100728 2712679360 caffe.cpp:313] Batch 97, loss = 0.937044
I0523 16:25:51.201591 2712679360 caffe.cpp:313] Batch 98, accuracy = 0.79
I0523 16:25:51.201622 2712679360 caffe.cpp:313] Batch 98, loss = 0.677679
I0523 16:25:51.299702 2712679360 caffe.cpp:313] Batch 99, accuracy = 0.76
I0523 16:25:51.299736 2712679360 caffe.cpp:313] Batch 99, loss = 0.687144
I0523 16:25:51.299741 2712679360 caffe.cpp:318] Loss: 0.742307
I0523 16:25:51.299762 2712679360 caffe.cpp:330] accuracy = 0.754
I0523 16:25:51.299773 2712679360 caffe.cpp:330] loss = 0.742307 (* 1 = 0.742307 loss)

The final test-set accuracy reaches accuracy = 0.754, which caffe reports as the mean of the 100 per-batch accuracies above; it matches the Test net output logged at the end of training.
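
The same evaluation can also be scripted through pycaffe instead of caffe.bin. A minimal sketch, assuming it is run from the caffe root directory with pycaffe on PYTHONPATH:

import caffe

caffe.set_mode_cpu()

# Load the TEST-phase net with the weights snapshotted at iteration 5000.
net = caffe.Net('examples/cifar10/cifar10_quick_train_test.prototxt',
                'examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5',
                caffe.TEST)

n_iter = 100  # 100 batches x batch_size 100 = the 10,000-image test set
total_acc = 0.0
for _ in range(n_iter):
    out = net.forward()  # fetches the next batch from cifar10_test_lmdb
    total_acc += float(out['accuracy'])

print('accuracy = %.4f' % (total_acc / n_iter))

Averaging the per-batch accuracies this way reproduces the aggregate figure that caffe.bin prints at the end of its run.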

This concludes our walk-through of training and testing the cifar10 model.