Multilayer perceptron in Keras

Time: 2018-04-10 17:49:15

Tags: python neural-network keras

I built a multilayer perceptron model in Keras, but I ran into a problem during training: from the second epoch onward the accuracy becomes constant and equal to 1. I have tried varying the activation functions, the number of epochs, and so on, but in vain.

Here is the script:

from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import ModelCheckpoint

# Checkpoint that keeps only the best weights seen during training.
checkpointer4 = ModelCheckpoint(filepath="modelsup.h5",
                                verbose=0,
                                save_best_only=True,
                                save_weights_only=True)

# Simple MLP: 10 input features -> two ReLU hidden layers -> one sigmoid output.
model = Sequential()
model.add(Dense(64, input_dim=10, activation='relu'))
model.add(Dense(64, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])

history = model.fit(x_train, y_train,
                    epochs=25,
                    batch_size=10,
                    validation_data=(x_test, y_test),
                    shuffle=True,
                    verbose=1,
                    callbacks=[checkpointer4]).history

# Note: saving to the same file overwrites the checkpoint's best weights with
# the final-epoch weights, so load_weights just reloads what was saved here.
model.save_weights("modelsup.h5")
model.load_weights("modelsup.h5")

score = model.evaluate(x_test, y_test, batch_size=25)
print('Test score:', score)
print(model.summary())
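One thing worth ruling out, given that both the training and validation loss sit near 1e-07: overlap between the train and test splits. Below is a minimal sanity check, a sketch only, assuming x_train and x_test are 2-D NumPy float arrays with the same names as in the script above:

import numpy as np

# Count test rows that also appear verbatim in the training set.
train_rows = {row.tobytes() for row in np.ascontiguousarray(x_train)}
overlap = sum(row.tobytes() in train_rows
              for row in np.ascontiguousarray(x_test))
print(overlap, 'of', len(x_test), 'test rows also occur in x_train')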

Here is part of the training output:

Epoch 25/25

   10/10000 [..............................] - ETA: 6s - loss: 1.1345e-07 - acc: 1.0000
  120/10000 [..............................] - ETA: 5s - loss: 1.1425e-07 - acc: 1.0000
  230/10000 [..............................] - ETA: 5s - loss: 1.1370e-07 - acc: 1.0000
  ...
 9910/10000 [============================>.] - ETA: 0s - loss: 1.1424e-07 - acc: 1.0000
 9990/10000 [============================>.] - ETA: 0s - loss: 1.1425e-07 - acc: 1.0000
10000/10000 [==============================] - 7s 701us/step - loss: 1.1425e-07 - acc: 1.0000 - val_loss: 1.1513e-07 - val_acc: 1.0000

  25/4999 [..............................] - ETA: 0s
 ...
4675/4999 [===========================>..] - ETA: 0s
4999/4999 [==============================] - 0s 96us/step
Test score: [1.1512877142241595e-07, 1.0]

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_13 (Dense)             (None, 64)                704       
_________________________________________________________________
dense_14 (Dense)             (None, 64)                4160      
_________________________________________________________________
dense_15 (Dense)             (None, 1)                 65        
=================================================================
Total params: 4,929
Trainable params: 4,929
Non-trainable params: 0
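For reference, the parameter counts in the summary follow from (inputs + 1 bias) * units for each Dense layer:

(10 + 1) * 64   # dense_13 -> 704
(64 + 1) * 64   # dense_14 -> 4160
(64 + 1) * 1    # dense_15 -> 65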

I am doing binary classification. Here is what the data looks like.

x_train:

[[0.69065374 0.         0.27677792 ... 0.         1.0274839  0.48911577]
 [0.4631601  0.04829948 0.11175615 ... 0.09347356 1.8268523  0.        ]
 [0.7308857  0.         0.3192799  ... 0.         2.8403711  0.14755964]
 ...
 [1.3862612  0.         1.0800421  ... 0.8344357  0.8264028  0.        ]
 [2.4669604  0.         1.210294   ... 1.9650785  1.3511596  0.        ]
 [2.246204   0.         1.1608332  ... 1.9253167  1.2738075  0.        ]]

y_train

[0. 0. 0. ... 1. 1. 1.]

The samples with label 0 come first, followed by the samples with label 1, because I applied some separate processing to each class and then concatenated them before fitting the model.
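If any step happens before shuffling (for example, a train/test split taken on the concatenated, class-sorted arrays), it may help to shuffle features and labels jointly first. A minimal sketch, assuming NumPy arrays:

import numpy as np

# Apply one permutation to both arrays so (x, y) pairs stay aligned.
perm = np.random.permutation(len(x_train))
x_train, y_train = x_train[perm], y_train[perm]

# Quick class-balance check after concatenation:
print(np.unique(y_train, return_counts=True))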

Is there an explanation for this, perhaps in the way I feed in the data?

0 Answers