Why does YOLOv3-Darknet accuracy drop while the average loss keeps decreasing?

Asked: 2018-12-03 07:22:09

Tags: deep-learning yolo darknet

I prepared about 20,000 labeled photos. I augmented them to build a training dataset of about 80,000 labeled images across 12 classes. Here is the cfg:

[net]
batch=96
subdivisions=16

width=480
height=480
channels=3

momentum=0.9
decay=0.0005
angle=9.0
saturation = 1.5
exposure = 1.5
hue=.1

learning_rate=0.00025
burn_in=1000
max_batches = 30000
policy=steps
steps=24000,27000
scales=.1,.1
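To make the schedule above concrete, here is a minimal sketch of how Darknet's `steps` policy combines `learning_rate`, `burn_in`, `steps`, and `scales`: a polynomial ramp-up during burn-in, then the rate is multiplied by each scale as the corresponding step is passed. The function name and the `power=4` ramp exponent are assumptions for illustration; the base rate is taken from the cfg (note the log below prints `0.001000 rate`, not `0.00025`, so the printed rate may be scaled by the framework).

```python
def effective_lr(batch, base_lr=0.00025, burn_in=1000,
                 steps=(24000, 27000), scales=(0.1, 0.1), power=4):
    """Sketch of Darknet's 'steps' learning-rate policy (illustrative only)."""
    # During burn-in, the rate ramps up polynomially from ~0 to base_lr.
    if batch < burn_in:
        return base_lr * (batch / burn_in) ** power
    lr = base_lr
    # Each time a step threshold is passed, multiply by the matching scale.
    for step, scale in zip(steps, scales):
        if batch >= step:
            lr *= scale
    return lr

print(effective_lr(2000))   # plateau value
print(effective_lr(25000))  # after the first step
print(effective_lr(28000))  # after the second step
```

With `scales=.1,.1`, the rate drops by 10x at iteration 24000 and again at 27000, which matches the 0.001 → 0.0001 → 0.00001 transitions visible in the log below.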

After training, the results look strange:

Iters   Prob    Log
 2000   52.4%   2000: 3.736104, 4.614188 avg, 0.001000 rate, 9.903145 seconds, 768000 images
 4000   85.7%   4000: 2.040192, 2.168153 avg, 0.001000 rate, 4.022353 seconds, 1536000 images
 6000   90.1%   6000: 1.334484, 1.247831 avg, 0.001000 rate, 3.964317 seconds, 2304000 images
 8000   98.3%   8000: 1.362864, 1.344616 avg, 0.001000 rate, 3.528447 seconds, 3072000 images
10000   75.9%   10000: 2.488457, 2.152407 avg, 0.001000 rate, 8.545843 seconds, 3840000 images
12000   63.9%   12000: 0.913796, 1.013006 avg, 0.001000 rate, 6.956745 seconds, 4608000 images
14000   83.6%   14000: 0.913707, 0.973148 avg, 0.001000 rate, 5.753258 seconds, 5376000 images
16000   79.2%   16000: 0.977795, 0.977933 avg, 0.001000 rate, 5.900009 seconds, 6144000 images
18000   91.8%   18000: 1.069363, 1.498939 avg, 0.001000 rate, 6.355244 seconds, 6912000 images
20000   81.9%   20000: 1.117647, 1.133991 avg, 0.001000 rate, 4.299940 seconds, 7680000 images
22000   75.4%   22000: 0.859802, 0.878308 avg, 0.001000 rate, 6.393010 seconds, 8448000 images
24000   74.3%   24000: 0.921492, 0.890497 avg, 0.000100 rate, 4.465703 seconds, 9216000 images
26000   77.6%   26000: 0.843734, 0.848669 avg, 0.000100 rate, 4.431544 seconds, 9984000 images
28000           28000: 0.731719, 0.780727 avg, 0.000010 rate, 8.379001 seconds, 10752000 images
30000   53.5%   30000: 0.817636, 0.835514 avg, 0.000010 rate, 10.245856 seconds, 11520000 images  
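It can help to plot loss against iteration rather than eyeballing the log. Here is a small hypothetical helper that extracts the iteration, loss, average loss, and rate from Darknet log lines in the format shown above (the regex and function name are my own, not part of Darknet):

```python
import re

# Matches lines like:
# "8000: 1.362864, 1.344616 avg, 0.001000 rate, 3.528447 seconds, 3072000 images"
LINE_RE = re.compile(r"(\d+):\s*([\d.]+),\s*([\d.]+)\s+avg,\s*([\d.]+)\s+rate")

def parse_line(line):
    """Return (iteration, loss, avg_loss, rate) or None for non-matching lines."""
    m = LINE_RE.search(line)
    if m is None:
        return None
    it, loss, avg, rate = m.groups()
    return int(it), float(loss), float(avg), float(rate)

sample = "8000: 1.362864, 1.344616 avg, 0.001000 rate, 3.528447 seconds, 3072000 images"
print(parse_line(sample))  # prints (8000, 1.362864, 1.344616, 0.001)
```

Feeding the parsed pairs into any plotting tool makes the steady decline of the average loss, and its independence from the fluctuating mAP, much easier to see.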

Why does the probability curve look like a bell curve? After iteration 8000 the probability drops, yet the average loss does not increase?

Is this overfitting? But then the average loss is not increasing.

The next day, I obtained more labeled photos and one additional class. I augmented these photos to a total of 93,347 labeled images across the 13 classes, and trained on this dataset with this config:

[net]
batch=96
subdivisions=16

width=480
height=480
channels=3
momentum=0.9
decay=0.0005
angle=9.0
saturation=1.5
exposure=1.5
hue=.1

learning_rate=0.00025
burn_in=1000

max_batches=20000
policy=steps
steps=16000,18000
scales=.1,.1
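For context, a quick back-of-the-envelope calculation relates `max_batches` in this cfg to full passes over the 93,347-image dataset (all numbers taken from the question; this ignores `subdivisions`, which only splits each batch into smaller GPU chunks):

```python
# Rough arithmetic: how many epochs does max_batches correspond to?
batch = 96
max_batches = 20000
dataset_size = 93347

images_seen = batch * max_batches    # total images processed during training
epochs = images_seen / dataset_size  # full passes over the dataset
print(images_seen, round(epochs, 1))  # prints 1920000 20.6
```

So this run sees each image roughly 20 times, versus roughly 36 times in the first run (96 × 30000 / 80000), even though the "Iters" column looks similar between the two logs.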

Here are the results:

Iters   Prob    Log
 2000   91.5%   2000: 2.690941, 2.899412 avg, 0.001000 rate, 8.567916 seconds, 768000 images
 4000   99.0%   4000: 2.018585, 2.097449 avg, 0.001000 rate, 6.562051 seconds, 1536000 images
 6000   99.0%   6000: 1.645114, 1.451750 avg, 0.001000 rate, 4.518591 seconds, 2304000 images
 8000   99.0%   8000: 0.995837, 1.090722 avg, 0.001000 rate, 6.309589 seconds, 3072000 images
10000   99.0%   10000: 1.210012, 1.346853 avg, 0.001000 rate, 5.764971 seconds, 3840000 images
12000   99.0%   12000: 1.033169, 0.977759 avg, 0.001000 rate, 9.203438 seconds, 4608000 images
14000   99.0%   14000: 0.888880, 0.943430 avg, 0.001000 rate, 5.597564 seconds, 5376000 images
16000   99.0%   16000: 1.181578, 1.141082 avg, 0.000100 rate, 4.087448 seconds, 6144000 images
18000   99.0%   18000: 0.950411, 0.944034 avg, 0.000010 rate, 3.983405 seconds, 6912000 images
20000   99.0%   20000: 0.828396, 0.861167 avg, 0.000010 rate, 10.150156 seconds, 7680000 images

0 Answers
