Connecting the Encoder from an SAE to an LSTM

Date: 2019-03-14 16:16:03

Tags: tensorflow keras lstm autoencoder

I have looked at the same problem for a sequential autoencoder:

Connect Encoder from AutoEncoder to LSTM

My problem is similar, but I want to use a stacked autoencoder. The code is as follows:

import numpy as np
from keras.layers import Input, Dense, LSTM
from keras.models import Model

features = 20
timesteps = 16
encoding_dim = 10

# Dummy data: 4000 sequences of 16 timesteps with 20 features each
sig = np.random.rand(4000, timesteps, features)

# Stacked autoencoder: the Dense layers operate on the last (feature) axis
input_img = Input(shape=(timesteps, features))
encoded = Dense(encoding_dim, activation='relu')(input_img)
decoded = Dense(features, activation='relu')(encoded)

autoencoder = Model(input_img, decoded)
encoder = Model(input_img, encoded)

autoencoder.compile(optimizer='adadelta', loss='mae')
encoder.summary()

autoencoder.fit(x=sig, y=sig,
                epochs=10,
                batch_size=100,
                validation_split=0.2)

# Feed the encoder's output into a two-layer LSTM
lstm_input = Input(shape=(timesteps, features))

inputs = encoder(inputs=lstm_input)

lstm1 = LSTM(encoding_dim, activation='relu', return_sequences=True, stateful=False)(inputs)
lstm2 = LSTM(1, return_sequences=False, stateful=False, activation='relu')(lstm1)

model_lstm = Model(inputs, lstm2)

model_lstm.summary()

But I get the following error:

/opt/conda/lib/python3.6/site-packages/keras/engine/network.py:180: UserWarning: Model inputs must come from `keras.layers.Input` (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to your model was not an Input tensor, it was generated by layer model_4.
Note that input tensors are instantiated via `tensor = keras.layers.Input(shape)`.
The tensor that caused the issue was: model_4/dense_3/Relu:0
  str(x.name))

Any ideas on how to connect the encoder to the LSTM correctly?
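For reference, a minimal sketch of one likely fix, assuming the standard Keras functional API: the warning says that a `Model`'s inputs must be `Input` tensors, but `Model(inputs, lstm2)` passes the *output* of the encoder (`model_4/dense_3/Relu:0`) as the model input. Building the second model from the original `Input` tensor `lstm_input` instead should resolve it:

```python
import numpy as np
from keras.layers import Input, Dense, LSTM
from keras.models import Model

features = 20
timesteps = 16
encoding_dim = 10

# Stacked autoencoder, as in the question
input_img = Input(shape=(timesteps, features))
encoded = Dense(encoding_dim, activation='relu')(input_img)
decoded = Dense(features, activation='relu')(encoded)
autoencoder = Model(input_img, decoded)
encoder = Model(input_img, encoded)

# Reuse the encoder as a layer inside the LSTM model
lstm_input = Input(shape=(timesteps, features))
encoded_seq = encoder(lstm_input)  # shape: (batch, timesteps, encoding_dim)

lstm1 = LSTM(encoding_dim, activation='relu', return_sequences=True)(encoded_seq)
lstm2 = LSTM(1, activation='relu', return_sequences=False)(lstm1)

# Key change: the model input is the Input tensor `lstm_input`,
# not the encoder's output tensor.
model_lstm = Model(lstm_input, lstm2)
model_lstm.summary()
```

If the encoder has already been pretrained and its weights should stay fixed while training the LSTM, one could additionally set `encoder.trainable = False` before compiling `model_lstm`.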

0 Answers:

There are no answers.