Loss is not decreasing and is very high in Keras

Time: 2019-07-25 18:50:13

Tags: python machine-learning keras neural-network deep-learning

I am learning deep learning with Keras, but I have run into a problem: the loss is not decreasing and is very high, around 650.

I am working with the MNIST dataset from tensorflow.keras.datasets.mnist. There are no errors; my NN simply isn't learning.

Here is my model:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten
import tensorflow.nn as tfnn

inputdim = 28 * 28

model = Sequential()

model.add(Flatten())
model.add(Dense(inputdim, activation = tfnn.relu))
model.add(Dense(128, activation = tfnn.relu))
model.add(Dense(10, activation = tfnn.softmax))

model.compile(loss = 'categorical_crossentropy', optimizer = 'adam', metrics = ['accuracy'])
model.fit(X_train, Y_train, epochs = 4)
```

And my output:

```
Epoch 1/4
60000/60000 [==============================] - 32s 527us/sample - loss: 646.0926 - acc: 6.6667e-05
Epoch 2/4
60000/60000 [==============================] - 39s 652us/sample - loss: 646.1003 - acc: 0.0000e+00
Epoch 3/4
60000/60000 [==============================] - 35s 590us/sample - loss: 646.1003 - acc: 0.0000e+00
Epoch 4/4
60000/60000 [==============================] - 33s 544us/sample - loss: 646.1003 - acc: 0.0000e+00
```

2 answers:

Answer 0 (score: 0)

You could try the sparse_categorical_crossentropy loss function. Also, what is your batch size? And, as already suggested, you may want to increase the number of epochs.
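The practical difference between the two losses is only the label format they expect: categorical_crossentropy wants one-hot vectors, while sparse_categorical_crossentropy wants integer class indices (which is what the raw MNIST labels are). A minimal NumPy sketch of the two formulas for a single sample (the function names here are illustrative, not the Keras API):

```python
import numpy as np

def categorical_ce(y_onehot, p):
    # expects a one-hot label vector, e.g. [0, 0, 1]
    return -np.sum(y_onehot * np.log(p))

def sparse_categorical_ce(y_int, p):
    # expects an integer class index, e.g. 2
    return -np.log(p[y_int])

p = np.array([0.1, 0.2, 0.7])        # softmax output for one sample
onehot = np.array([0.0, 0.0, 1.0])   # one-hot encoding of class 2

# both formulations give the same loss when the labels match
assert np.isclose(categorical_ce(onehot, p), sparse_categorical_ce(2, p))
```

So switching to sparse_categorical_crossentropy lets you keep the integer labels as-is instead of one-hot encoding Y_train.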

Answer 1 (score: 0)

OK, I added BatchNormalization between the layers and changed the loss function to 'sparse_categorical_crossentropy'. This is what my NN looks like now:

```python
# in addition to the earlier imports:
from tensorflow.keras.layers import BatchNormalization

model = Sequential()

model.add(Flatten())
model.add(BatchNormalization(axis = 1, momentum = 0.99))
model.add(Dense(inputdim, activation = tfnn.relu))
model.add(BatchNormalization(axis = 1, momentum = 0.99))
model.add(Dense(128, activation = tfnn.relu))
model.add(BatchNormalization(axis = 1, momentum = 0.99))
model.add(Dense(10, activation = tfnn.softmax))

model.compile(loss = 'sparse_categorical_crossentropy', optimizer = 'adam', metrics = ['accuracy'])
```

And these are the results:

```
Epoch 1/4
60000/60000 [==============================] - 68s 1ms/sample - loss: 0.2045 - acc: 0.9374
Epoch 2/4
60000/60000 [==============================] - 55s 916us/sample - loss: 0.1007 - acc: 0.9689
```

Thank you for your help!