How to connect convolutional layers to an LSTM in TensorFlow Keras

Asked: 2019-03-30 11:40:59

Tags: tensorflow keras conv-neural-network lstm

I am experimenting with neural network architectures and am trying to connect a 2D convolution to an LSTM cell in TensorFlow Keras.

Here is my original model:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import CuDNNLSTM, Dropout, BatchNormalization, Dense

model = Sequential()

model.add(CuDNNLSTM(256, input_shape=(train_x.shape[1:]), return_sequences=True))
model.add(Dropout(0.2))
model.add(BatchNormalization())

model.add(Dense(64, activation='relu'))
model.add(Dropout(0.2))

model.add(Dense(4, activation='softmax'))

It works like a charm.

train_x consists of 1209 sequences, each 128 steps long with 23 numbers per step. In other words, its shape is (1209, 128, 23). The model's input equals train_x.shape[1:] = (128, 23).
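These shapes can be checked with a quick sketch (dummy zero data standing in for the real train_x):

```python
import numpy as np

# Hypothetical stand-in for train_x, just to verify the shapes
train_x = np.zeros((1209, 128, 23), dtype=np.float32)

print(train_x.shape)      # (1209, 128, 23): 1209 sequences, 128 steps, 23 features
print(train_x.shape[1:])  # (128, 23): the per-sample input_shape passed to the model
```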

Now I intend to add a Dense layer of 256 units before the LSTM cell, reshape its output to 16x16, add a 2D convolution, then flatten it and connect it to the LSTM cell (keeping the same layers after the LSTM cell).
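The per-timestep shape arithmetic behind this plan works out as follows (assuming Conv2D gets a (16, 16, 1) input with an explicit channel axis, 'same' padding, and 2x2 pooling):

```python
# Walking the intended pipeline for a single timestep's feature vector
features = 23
dense_out = 256                         # Dense(256): 23 -> 256
grid = (16, 16)                         # Reshape: 256 == 16 * 16
assert grid[0] * grid[1] == dense_out

filters = 16                            # Conv2D(16, (5, 5), padding='same') keeps 16x16
pooled = (grid[0] // 2, grid[1] // 2)   # MaxPooling2D((2, 2)) -> 8x8
flat = pooled[0] * pooled[1] * filters  # Flatten -> features fed to the LSTM per step
print(flat)  # 1024
```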

I started with:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (CuDNNLSTM, Dense, Dropout, BatchNormalization,
                                     Reshape, Conv2D, MaxPooling2D, Flatten)

model = Sequential()

model.add(Dense(256, input_shape=(train_x.shape[1:]), activation='relu'))
model.add(Dropout(0.2))

model.add(Reshape((16, 16), input_shape=(256,)))
model.add(Conv2D(16, (5,5), activation='relu', padding='same', input_shape=(16,16,1)))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Flatten())

model.add(CuDNNLSTM(256, return_sequences=True))
model.add(Dropout(0.1))
model.add(BatchNormalization())

model.add(Dense(64, activation='relu'))
model.add(Dropout(0.2))

model.add(Dense(4, activation='softmax'))

I ran into two errors:

Input 0 of layer conv2d is incompatible with the layer: expected ndim=4, found ndim=3. Full shape received: [None, 16, 16]

And when I remove the convolution and keep only the Reshape and Flatten layers:

tensorflow.python.framework.errors_impl.InvalidArgumentError: Input to reshape is a tensor with 4194304 values, but the requested shape has 32768
 [[{{node reshape/Reshape}} = Reshape[T=DT_FLOAT, Tshape=DT_INT32, _class=["loc:@training/Adam/gradients/dropout/cond/Merge_grad/cond_grad"], _device="/job:localhost/replica:0/task:0/device:GPU:0"](dropout/cond/Merge, reshape/Reshape/shape)]]
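The numbers in that message check out: Dense is applied to every timestep, so the tensor reaching Reshape holds 128 × 256 values per sample rather than 256, and the mismatch reproduces exactly (a back-of-the-envelope check; the batch size of 128 is inferred from the error, not stated above):

```python
# Inferred batch size (4194304 / 32768 = 128); timesteps and units from the model
batch_size, timesteps, dense_units = 128, 128, 256

# What Reshape((16, 16)) actually receives: Dense acts per timestep,
# so its output is (batch, 128, 256), not (batch, 256)
values_in = batch_size * timesteps * dense_units
print(values_in)         # 4194304 -- "a tensor with 4194304 values"

# What Reshape((16, 16)) asks for: (batch, 16, 16)
values_requested = batch_size * 16 * 16
print(values_requested)  # 32768   -- "the requested shape has 32768"
```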

Do you have any idea how to handle this?
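One direction that lines up with both errors: the Dense/Reshape/Conv2D stack should run once per timestep, which Keras expresses by wrapping the spatial layers in TimeDistributed, and the reshape needs a trailing channel axis so Conv2D sees ndim=4 per sample. A sketch of that variant (using LSTM instead of CuDNNLSTM so it also runs on CPU; on GPU CuDNNLSTM should drop in unchanged):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (LSTM, Dense, Dropout, BatchNormalization,
                                     Reshape, Conv2D, MaxPooling2D, Flatten,
                                     TimeDistributed)

model = Sequential()

# (None, 128, 23) -> (None, 128, 256): one Dense applied to every timestep
model.add(TimeDistributed(Dense(256, activation='relu'), input_shape=(128, 23)))
model.add(Dropout(0.2))

# Each 256-vector becomes a 16x16x1 "image"; the time axis is preserved
model.add(TimeDistributed(Reshape((16, 16, 1))))
model.add(TimeDistributed(Conv2D(16, (5, 5), activation='relu', padding='same')))
model.add(TimeDistributed(MaxPooling2D(pool_size=(2, 2))))  # -> (None, 128, 8, 8, 16)
model.add(TimeDistributed(Flatten()))                       # -> (None, 128, 1024)

model.add(LSTM(256, return_sequences=True))
model.add(Dropout(0.1))
model.add(BatchNormalization())

model.add(Dense(64, activation='relu'))
model.add(Dropout(0.2))

model.add(Dense(4, activation='softmax'))

print(model.output_shape)  # (None, 128, 4)
```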

0 Answers:

There are no answers yet.