Reduce the verbosity of Caffe's training output?

Date: 2016-01-12 22:02:07

Tags: python machine-learning caffe pycaffe

I compiled Caffe with the Debug flag. Now when I run

./examples/mnist/train_lenet.sh

I get this output:

I0112 22:50:49.680357 114020 data_layer.cpp:103]      Read time: 0.095 ms.
I0112 22:50:49.680376 114020 data_layer.cpp:104] Transform time: 0.821 ms.
I0112 22:50:49.681077 113921 solver.cpp:409]     Test net output #0: accuracy = 0.9902
I0112 22:50:49.681115 113921 solver.cpp:409]     Test net output #1: loss = 0.0292544 (* 1 = 0.0292544 loss)
I0112 22:50:49.681125 113921 solver.cpp:326] Optimization Done.
I0112 22:50:49.681133 113921 caffe.cpp:215] Optimization Done.
I0112 22:50:49.681915 114020 data_layer.cpp:102] Prefetch batch: 1 ms.
I0112 22:50:49.681929 114020 data_layer.cpp:103]      Read time: 0.095 ms.
I0112 22:50:49.681948 114020 data_layer.cpp:104] Transform time: 0.829 ms.

http://pastebin.com/cbXTH5HH

I would like the output without the "Read time" and "Prefetch batch" lines, and without recompiling. Something like this:

I1130 00:30:48.030009  5007 solver.cpp:590] Iteration 9900, lr = 0.00596843
I1130 00:30:49.105876  5007 solver.cpp:468] Snapshotting to binary proto file examples/mnist/lenet_iter_10000.caffemodel
I1130 00:30:49.117113  5007 solver.cpp:753] Snapshotting solver state to binary proto file examples/mnist/lenet_iter_10000.solverstate
I1130 00:30:49.125869  5007 solver.cpp:327] Iteration 10000, loss = 0.00332428
I1130 00:30:49.125888  5007 solver.cpp:347] Iteration 10000, Testing net (#0)
I1130 00:30:49.722595  5007 solver.cpp:415]     Test net output #0: accuracy = 0.9905
I1130 00:30:49.722626  5007 solver.cpp:415]     Test net output #1: loss = 0.0302176 (* 1 = 0.0302176 loss)
I1130 00:30:49.722642  5007 solver.cpp:332] Optimization Done.
I1130 00:30:49.722647  5007 caffe.cpp:215] Optimization Done.

For example: http://pastebin.com/F5c3Yutu

1 Answer:

Answer 0 (score: 5)

How about

./examples/mnist/train_lenet.sh | grep -v "Read time:" | grep -v "Prefetch batch:"

That gave me the filtered output from your sample input.
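A small self-contained sketch of the same idea, with two assumptions worth flagging: the two `grep -v` calls are combined into a single `grep -vE` alternation (also dropping the "Transform time" lines), and since Caffe's glog messages normally go to stderr, the real pipeline may need `2>&1` before the pipe, i.e. `./examples/mnist/train_lenet.sh 2>&1 | grep -vE ...`. Here the filter is demonstrated on sample lines from the question's log instead of a live training run:

```shell
# Feed a few of the question's log lines through the filter;
# the per-batch timing lines are dropped, everything else passes through.
printf '%s\n' \
  'I0112 22:50:49.680357 114020 data_layer.cpp:103]      Read time: 0.095 ms.' \
  'I0112 22:50:49.681915 114020 data_layer.cpp:102] Prefetch batch: 1 ms.' \
  'I0112 22:50:49.681948 114020 data_layer.cpp:104] Transform time: 0.829 ms.' \
  'I0112 22:50:49.681125 113921 solver.cpp:326] Optimization Done.' |
  grep -vE 'Read time:|Prefetch batch:|Transform time:'
# Only the "Optimization Done." line is printed.
```

This only filters the text after the fact; the debug-build timing instrumentation still runs, it just no longer clutters the console.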