Keras ProgBar (verbose=1) has a strange output

Time: 2018-05-16 10:30:06

Tags: tensorflow keras

I'm fitting a simple NN with Keras with verbose set to 1, and I get this output for every epoch:

Epoch 1/10
9357/9357 [==============================] - ETA: 34s - loss: 8.3609 - acc: 
0.0000e+ - ETA: 19s - loss: 8.4473 - acc: 0.0020   - ETA: 15s - loss: 8.4058 
- acc: 0.00 - ETA: 12s - loss: 8.3917 - acc: 9.7656e- - ETA: 11s - loss: 
8.3876 - acc: 7.8125e- - ETA: 9s - loss: 8.3786 - acc: 0.0013     - ETA: 9s 
- loss: 8.3754 - acc: 0.001 - ETA: 8s - loss: 8.3732 - acc: 9.7656e-0 - ETA: 7s 
- loss: 8.3656 - acc: 0.0022    - ETA: 7s - loss: 8.3637 - acc: 0.007 - ETA: 
 6s - loss: 8.3672 - acc: 0.009 - ETA: 6s - loss: 8.3610 - acc: 0.014 - ETA: 
 5s - loss: 8.3597 - acc: 0.017 - ETA: 5s - loss: 8.3559 - acc: 0.021 - ETA: 
 5s - loss: 8.3551 - acc: 0.025 - ETA: 5s - loss: 8.3530 - acc: 0.028 - ETA: 
 4s - loss: 8.3527 - acc: 0.029 - ETA: 4s - loss: 8.3491 - acc: 0.031 - ETA: 4s - 
 loss: 8.3479 - acc: 0.033 - ETA: 3s - loss: 8.3465 - acc: 0.035 - ETA: 3s - 
 loss: 8.3432 - acc: 0.036 - ETA: 3s - loss: 8.3390 - acc: 0.038 - ETA: 3s - 
 loss: 8.3344 - acc: 0.041 - ETA: 2s - loss: 8.3329 - acc: 0.043 - ETA: 2s - 
 loss: 8.3290 - acc: 0.045 - ETA: 2s - loss: 8.3288 - acc: 0.046 - ETA: 2s - 
 loss: 8.3286 - acc: 0.048 - ETA: 1s - loss: 8.3273 - acc: 0.049 - ETA: 1s - 
 loss: 8.3238 - acc: 0.051 - ETA: 1s - loss: 8.3219 - acc: 0.052 - ETA: 1s - 
 loss: 8.3173 - acc: 0.054 - ETA: 0s - loss: 8.3154 - acc: 0.055 - ETA: 0s - 
 loss: 8.3152 - acc: 0.056 - ETA: 0s - loss: 8.3107 - acc: 0.057 - ETA: 0s - 
 loss: 8.3073 - acc: 0.058 - ETA: 0s - loss: 8.3029 - acc: 0.060 - 8s 903us/step - loss: 8.3040 - acc: 0.0604 - val_loss: 8.2866 - val_acc: 0.0588
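
From what I can tell, verbose=1 redraws a single status line in place using carriage-return/backspace characters, so a console that doesn't interpret them (some IDE consoles and notebook front-ends) prints every intermediate update side by side instead of overwriting the line, which matches the output above. As a sketch of the workaround I'm considering, verbose=2 is the documented "one line per epoch" mode and avoids in-place redrawing entirely (model, X, Y and history refer to my code below):

# verbose=2 prints one summary line per epoch instead of an
# in-place progress bar, so no carriage returns are involved
model.fit(X, Y, epochs=10, batch_size=256, validation_split=0.05,
          callbacks=[history], verbose=2)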

I've been using Keras for a while and I've never gotten output like this. This is the first time I'm using it with Conv2D layers. Here is my code, even though I don't think it's relevant:

import keras
from keras.models import Sequential
from keras.layers import Conv2D, BatchNormalization, Activation, MaxPooling2D, Flatten, Dense
from keras.callbacks import History

cnn_layers = [32]
cnn_kernels = [(5, 5)]
dense_layers = [32]

model = Sequential()
history = History()

for i in range(len(cnn_layers)):
    # input_shape is only needed on the first layer of a Sequential model
    if i == 0:
        model.add(Conv2D(cnn_layers[i], cnn_kernels[i], input_shape=X.shape[1:], padding='same'))
    else:
        model.add(Conv2D(cnn_layers[i], cnn_kernels[i], padding='same'))
    model.add(BatchNormalization())
    model.add(Activation('relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Flatten())

for i in range(len(dense_layers)):
    model.add(Dense(dense_layers[i], activation='relu'))

model.add(Dense(Y.shape[1], activation='softmax'))

adm = keras.optimizers.Adam(lr=0.001, decay=0)
model.compile(loss='categorical_crossentropy', optimizer=adm, metrics=['accuracy'])

model.fit(X, Y, epochs=10, batch_size=256, validation_split=0.05,
          callbacks=[history], verbose=1)
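
Update: to rule out my data pipeline, here is a minimal, self-contained sketch on random arrays that drives the same verbose=1 progress bar (the tiny model and the random X/Y here are made up purely for reproduction):

import numpy as np
import keras
from keras.models import Sequential
from keras.layers import Dense

# random inputs and one-hot targets, only to exercise the progress bar
X = np.random.rand(1024, 8)
Y = keras.utils.to_categorical(np.random.randint(4, size=1024), num_classes=4)

m = Sequential([
    Dense(16, activation='relu', input_shape=(8,)),
    Dense(4, activation='softmax'),
])
m.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
m.fit(X, Y, epochs=2, batch_size=32, verbose=1)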

0 Answers