Deep learning - should the loss be divided by `seq_length`?

Time: 2021-02-18 20:25:47

Tags: python deep-learning

I am new to deep learning and I am looking at train.py here: https://github.com/huanghao-code/VisRNN_ICLR_2016_Text/blob/master/train.py

I ran this code and the loss values are very high (log attached below). I think the loss should be divided by seq_length, which is defined as 70 in the config file. That is, I would add this at line 61: `loss = loss / seq_length`
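Concretely, I mean something like the sketch below. This is not the repo's exact code; the variable names (`logits`, `targets`, `criterion`) and the per-timestep loop are just my assumptions about how a char-RNN loss is typically accumulated:

```python
import torch
import torch.nn as nn

seq_length = 70                    # value from the config file
criterion = nn.CrossEntropyLoss()  # per-timestep cross-entropy (mean over the batch)

def sequence_loss(logits, targets):
    """logits: (seq_length, batch, vocab_size); targets: (seq_length, batch)."""
    loss = 0.0
    for t in range(seq_length):
        # sum the per-timestep losses over the whole sequence
        loss = loss + criterion(logits[t], targets[t])
    # dividing by seq_length turns the sum into an average loss per character,
    # so the printed value no longer scales with the sequence length
    return loss / seq_length

# example call with random data: batch of 32, vocabulary of 65 characters
logits = torch.randn(seq_length, 32, 65)
targets = torch.randint(0, 65, (seq_length, 32))
print(sequence_loss(logits, targets).item())
```

My understanding is that, if the loop in train.py sums the loss over the 70 timesteps like this, dividing by `seq_length` mainly rescales the reported number (and the gradient magnitude), which would explain why the printed values look so large.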

Am I right?

[1,   10] loss: 239.628
[1,   20] loss: 199.850
[1,   30] loss: 179.427
[1,   40] loss: 167.309
[1,   50] loss: 161.275
[1,   60] loss: 153.391
[1,   70] loss: 150.944
[1,   80] loss: 148.133
[1,   90] loss: 144.675
[1,  100] loss: 139.971
[1,  110] loss: 143.109
[1,  120] loss: 139.113

  2%|▏         | 1/50 [01:05<53:49, 65.90s/it]Training for 2 epochs...
[2,   10] loss: 137.888
[2,   20] loss: 130.642
[2,   30] loss: 135.233
[2,   40] loss: 131.931
[2,   50] loss: 131.866
[2,   60] loss: 128.675
[2,   70] loss: 130.911
[2,   80] loss: 130.391
[2,   90] loss: 127.622
[2,  100] loss: 122.895
[2,  110] loss: 130.114
[2,  120] loss: 127.582

....


 96%|█████████▌| 48/50 [52:11<02:07, 63.99s/it]Training for 49 epochs...
[49,   10] loss: 100.158
[49,   20] loss: 96.960
[49,   30] loss: 97.161
[49,   40] loss: 95.056
[49,   50] loss: 96.596
[49,   60] loss: 97.120
[49,   70] loss: 100.471
[49,   80] loss: 101.352
[49,   90] loss: 98.160
[49,  100] loss: 94.098
[49,  120] loss: 102.465

 98%|█████████▊| 49/50 [53:15<01:03, 63.94s/it]Training for 50 epochs...
[50,   10] loss: 100.296
[50,   20] loss: 96.931
[50,   30] loss: 97.210
[50,   40] loss: 94.970
[50,   50] loss: 96.643
[50,   60] loss: 96.761
[50,   70] loss: 100.220
[50,   80] loss: 101.168
[50,   90] loss: 97.711
[50,  100] loss: 93.862
[50,  110] loss: 102.333
[50,  120] loss: 102.228
