Autoencoder not showing expected output in keras

Time: 2019-03-22 21:08:58

Tags: python tensorflow keras autoencoder

I am experimenting with autoencoders in Keras. I have a dataset (a network-intrusion dataset, NLS-DATASET), and I want my autoencoder to learn the identity of the data so that it can reconstruct its input.

However, the output the autoencoder generates seems to be way off from the input.

My autoencoder code is shown below:

import keras
from keras.layers import Input, Dense
from keras.models import Model

# 37 preprocessed input features per record
input_event = Input(shape=(37,))

# Encoder and decoder layers of the autoencoder
encoded = Dense(units=35, activation='linear')(input_event)
encoded = Dense(units=30, activation='linear')(encoded)
encoded = Dense(units=25, activation='linear')(encoded)
encoded = Dense(units=15, activation='linear')(encoded)
decoded = Dense(units=25, activation='linear')(encoded)
decoded = Dense(units=30, activation='linear')(decoded)
decoded = Dense(units=35, activation='linear')(decoded)
decoded = Dense(units=37, activation='sigmoid')(decoded)

# Build the autoencoder
autoencoder = Model(input_event, decoded)

# Compile the autoencoder (binary_crossentropy was also tried as the loss)
autoencoder.compile(optimizer='adadelta', loss='mean_squared_error', metrics=['accuracy'])

# Fit the autoencoder to reconstruct the preprocessed training data
autoencoder.fit(train_preprocessed, train_preprocessed,
                epochs=500,
                batch_size=500,
                shuffle=True)
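
The reconstruction below was obtained by predicting on a single sample and comparing it with the original record, roughly along these lines (a minimal sketch; it assumes train_preprocessed is a NumPy array of shape (n_samples, 37)):

import numpy as np

# Take one preprocessed record and let the trained autoencoder reconstruct it
sample = train_preprocessed[:1]               # shape (1, 37)
reconstruction = autoencoder.predict(sample)  # shape (1, 37)

# Print original and reconstruction side by side, plus the mean absolute error
np.set_printoptions(precision=6, suppress=True)
print('original      :', sample[0])
print('reconstruction:', reconstruction[0])
print('mean abs error:', np.abs(sample[0] - reconstruction[0]).mean())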

The reconstructed input for one of the training samples looks like this:

array([  1.44929445e-04,   0.00000000e+00,   0.00000000e+00,
         0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
         0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
         0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
         0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
         0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
         1.95694716e-03,   1.95694716e-03,   0.00000000e+00,
         0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
         1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
         5.25490196e-01,   3.37254902e-01,   6.10000000e-01,
         4.00000000e-02,   6.10000000e-01,   2.00000000e-02,
         0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
         0.00000000e+00])

I have tried different activation functions between the layers, different layer depths, and different loss functions; one example variant is sketched below.
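
For instance, one of the variations was roughly along these lines (a hypothetical sketch of the kind of change, not the exact code I ran): relu in the hidden layers, a sigmoid output, and binary cross-entropy as the loss.

# Hypothetical variant: relu hidden layers, sigmoid output, binary cross-entropy loss
encoded = Dense(units=30, activation='relu')(input_event)
encoded = Dense(units=15, activation='relu')(encoded)
decoded = Dense(units=30, activation='relu')(encoded)
decoded = Dense(units=37, activation='sigmoid')(decoded)

variant = Model(input_event, decoded)
variant.compile(optimizer='adam', loss='binary_crossentropy')
variant.fit(train_preprocessed, train_preprocessed,
            epochs=500, batch_size=500, shuffle=True)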

If you can spot any mistake I have made, I would greatly appreciate it.

Thank you.

0 Answers:

There are no answers.