I am trying to train a conditional encoding architecture with two LSTMs. However, when I pass the embeddings as input to fit, I get an error. It is basically a multi-input model.
# Assumed to be defined elsewhere: X_train, Xprev, y_train, X_val, y_val,
# vocab_size, vocab_sizeP, embedding_matrix, embedding_matrixP, maxlen, EMBEDDING_DIM
from keras.layers import Input, Embedding, Reshape, LSTM, Dense
from keras.models import Model

sequence_length = X_train.shape[1]
sequence_lengthP = Xprev.shape[1]

# Pre-trained, frozen embedding for the "previous" sequence
embedding_layerP = Embedding(vocab_sizeP, 300, weights=[embedding_matrixP], input_length=maxlen, trainable=False)
inputsP = Input(shape=(sequence_lengthP,))
embeddingP = embedding_layerP(inputsP)
reshapeP = Reshape((sequence_lengthP, EMBEDDING_DIM, 1))(embeddingP)

# Pre-trained, frozen embedding for the main sequence
embedding_layer = Embedding(vocab_size, 300, weights=[embedding_matrix], input_length=maxlen, trainable=False)
inputs = Input(shape=(sequence_length,))
embedding = embedding_layer(inputs)
reshape = Reshape((sequence_length, EMBEDDING_DIM, 1))(embedding)

# Conditional encoding: the final states of prev_lstm initialise main_lstm
prev_input = Input(shape=(sequence_lengthP, 300,), name='prev_input')
prev_lstm = LSTM(64, name='prev_lstm', return_state=True)
prev_output, state_h, state_c = prev_lstm(prev_input)
final_prev_states = [state_h, state_c]

main_input = Input(shape=(sequence_length, 300,), name='main_input')
main_lstm = LSTM(64, name='main_lstm', return_state=True)
main_outputs, _, _ = main_lstm(main_input, initial_state=final_prev_states)

dense = Dense(100, activation='relu', name='dense')(main_outputs)
output = Dense(3, activation='softmax', name='output')(dense)

model_cond = Model(inputs=[prev_input, main_input], outputs=output)
model_cond.compile(optimizer='rmsprop', loss='categorical_crossentropy', metrics=['accuracy'])

# This call fails with the error below
history = model_cond.fit([embeddingP, embedding], y_train, epochs=150, verbose=1, validation_data=(X_val, y_val), shuffle=False)
The error is:
ValueError: When feeding symbolic tensors to a model, we expect the tensors to have a static batch size. Got tensor with shape: (None, 27, 300)
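For context, this is how I am checking the static shapes of the two embedding tensors (my own check using the Keras backend helper K.int_shape, not part of the failing code):

from keras import backend as K

print(K.int_shape(embeddingP))  # prints (None, 27, 300)
print(K.int_shape(embedding))   # prints (None, 27, 300)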
Both embedding and embeddingP have the shape (?, 27, 300). Do you know what I should pass as input to fit?
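For reference, this is a minimal sketch of the alternative I am considering, reusing the imports and variables defined above: wiring the Embedding layers directly into the two LSTMs so that fit receives the raw padded integer arrays (Xprev, X_train) instead of symbolic tensors. The variable Xprev_val is hypothetical (a validation split of the previous sequences that I would still have to prepare), and I have not verified that this version trains correctly:

# Sketch only: the Embedding outputs feed the LSTMs directly, so the model's
# inputs are the integer-encoded sequences themselves.
inputsP = Input(shape=(sequence_lengthP,), name='prev_input')
embeddedP = Embedding(vocab_sizeP, 300, weights=[embedding_matrixP],
                      input_length=sequence_lengthP, trainable=False)(inputsP)
_, state_h, state_c = LSTM(64, name='prev_lstm', return_state=True)(embeddedP)

inputs = Input(shape=(sequence_length,), name='main_input')
embedded = Embedding(vocab_size, 300, weights=[embedding_matrix],
                     input_length=sequence_length, trainable=False)(inputs)
main_outputs, _, _ = LSTM(64, name='main_lstm', return_state=True)(
    embedded, initial_state=[state_h, state_c])

dense = Dense(100, activation='relu', name='dense')(main_outputs)
output = Dense(3, activation='softmax', name='output')(dense)

model_cond = Model(inputs=[inputsP, inputs], outputs=output)
model_cond.compile(optimizer='rmsprop', loss='categorical_crossentropy',
                   metrics=['accuracy'])

# fit now takes NumPy arrays of token ids; Xprev_val is hypothetical.
history = model_cond.fit([Xprev, X_train], y_train, epochs=150, verbose=1,
                         validation_data=([Xprev_val, X_val], y_val), shuffle=False)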