In my Keras model I am using the TimeDistributed wrapper, but I keep running into a shape-mismatch error. These are the layers:
r_input = Input(shape=(100,), dtype='int32')                        # one sentence: 100 word indices
embedded_sequences = embedding_layer(r_input)
r_lstm = Bidirectional(GRU(100, return_sequences=True))(embedded_sequences)
r_dense = TimeDistributed(Dense(200))(r_lstm)
r_att = AttLayer()(r_dense)                                         # custom attention layer
sentEncoder = Model(r_input, r_att)                                 # sentence-level encoder

input = Input(shape=(100,15), dtype='int32')                        # document: 2-D array of word indices
encoder = TimeDistributed(sentEncoder)(input)                       # apply sentEncoder to every time step
l_lstm = Bidirectional(GRU(100, return_sequences=True))(encoder)    # -->> ERROR MESSAGE
I get the following error message:
ValueError: Input 0 is incompatible with layer time_distributed_4: expected ndim=3, found ndim=4
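To see where the extra dimension comes from, I run everything up to the encoder line and then check the shapes Keras has inferred (K.int_shape comes from keras.backend; these prints are just my own debugging, not part of the model):

from keras import backend as K

sentEncoder.summary()          # layer-by-layer output shapes of the inner sentence encoder
print(K.int_shape(r_att))      # output shape of AttLayer inside sentEncoder
print(K.int_shape(encoder))    # shape that TimeDistributed(sentEncoder) hands to the GRU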
How can I adjust the shape of the encoder output so that it matches the input expected by the Bidirectional layer?
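In case it helps to reproduce this: embedding_layer and AttLayer are defined elsewhere in my script. They look roughly like the sketch below — the vocabulary and embedding sizes are placeholders, and this AttLayer is a simplified stand-in for my actual custom attention layer (it weights each time step and sums over the time axis):

from keras.layers import Input, Dense, GRU, Bidirectional, TimeDistributed, Embedding, Layer
from keras.models import Model
from keras import backend as K

# Placeholder embedding: 20000-word vocabulary, 100-dimensional vectors, 100-word sentences
embedding_layer = Embedding(input_dim=20000, output_dim=100, input_length=100)

class AttLayer(Layer):
    """Simplified attention: learns one score per time step and returns the weighted sum."""
    def build(self, input_shape):
        # one weight vector of size (features, 1) used to score each time step
        self.W = self.add_weight(name='att_W',
                                 shape=(input_shape[-1], 1),
                                 initializer='glorot_uniform',
                                 trainable=True)
        super(AttLayer, self).build(input_shape)

    def call(self, x):
        # x: (batch, timesteps, features) -> scores: (batch, timesteps)
        scores = K.softmax(K.squeeze(K.dot(x, self.W), axis=-1))
        # weighted sum over the time axis -> (batch, features)
        return K.sum(x * K.expand_dims(scores, axis=-1), axis=1)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], input_shape[-1])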