Adding attention to my LSTM code in Keras

Asked: 2019-05-08 17:25:06

Tags: python tensorflow keras nlp

I want to add attention to my LSTM code, but I get the following error. Please help me. This is the error message I receive:

ValueError: Unexpectedly found an instance of type <class 'keras.layers.recurrent.LSTM'>. Expected a symbolic tensor instance.

embedding_layer = Embedding(nb_words, EMBEDDING_DIM, weights=[params['emb_matrix']],
        input_length=MAX_SEQUENCE_LENGTH, trainable=False)

lstm_layer = LSTM(512, dropout=0.2, recurrent_dropout=0.2)(embedding_layer)

sequence_1_input = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')
embedded_sequences_1 = embedding_layer(sequence_1_input)
x1 = lstm_layer(embedded_sequences_1)
sequence_2_input = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')
embedded_sequences_2 = embedding_layer(sequence_2_input)
y1 = lstm_layer(embedded_sequences_2)
merged = concatenate([x1, y1])
merged = Dropout(0.2)(merged)
merged = BatchNormalization()(merged)

attention = TimeDistributed(Dense(1, activation='tanh'))(lstm_layer) 
attention = Flatten()(attention)
attention = Activation('softmax')(attention)
attention = RepeatVector(512)(attention)
attention = Permute([2, 1])(attention)

sent_representation = merge([lstm_layer, attention], mode='mul')
sent_representation = Lambda(lambda xin: K.sum(xin, axis=1))(sent_representation)

merged = Dense(256, activation='relu')(merged)
merged = Dropout(0.2)(merged)
merged = BatchNormalization()(merged)

probabilities = Dense(1, activation='sigmoid')(sent_representation)
"""
merged = Dense(256, activation='relu')(merged)
merged = Dropout(0.2)(merged)
merged = BatchNormalization()(merged)

preds = Dense(1, activation='sigmoid')(merged)
"""
model = Model(inputs=[sequence_1_input, sequence_2_input], outputs=probabilities)
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()
return model
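
For what it's worth, the traceback points at lstm_layer = LSTM(512, ...)(embedding_layer): the LSTM is being called on the Embedding layer object rather than on a symbolic tensor such as embedding_layer(sequence_1_input), which is exactly what the ValueError complains about. Two further issues would surface once that is fixed: the attention block needs per-timestep LSTM outputs (return_sequences=True), and merge([...], mode='mul') no longer exists in Keras 2, where multiply is used instead. Below is a minimal sketch of one way to wire this up, assuming Keras 2.x and using placeholder values for nb_words, EMBEDDING_DIM, MAX_SEQUENCE_LENGTH and the embedding matrix; it is an illustrative rewrite, not the original poster's exact model.

import numpy as np
from keras import backend as K
from keras.models import Model
from keras.layers import (Input, Embedding, LSTM, Dense, Dropout,
                          BatchNormalization, TimeDistributed, Flatten,
                          Activation, RepeatVector, Permute, Lambda,
                          concatenate, multiply)

# Placeholder values; substitute the real vocabulary size, embedding size,
# sequence length and pretrained embedding matrix (params['emb_matrix']).
nb_words = 10000
EMBEDDING_DIM = 300
MAX_SEQUENCE_LENGTH = 30
emb_matrix = np.zeros((nb_words, EMBEDDING_DIM))

embedding_layer = Embedding(nb_words, EMBEDDING_DIM, weights=[emb_matrix],
                            input_length=MAX_SEQUENCE_LENGTH, trainable=False)

# Define the shared LSTM layer only; do not call it on the Embedding layer object.
# return_sequences=True keeps the per-timestep outputs that the attention needs.
lstm_layer = LSTM(512, dropout=0.2, recurrent_dropout=0.2, return_sequences=True)

def attend(sequence):
    # Soft attention over the timesteps of a (batch, time, 512) tensor:
    # score each timestep, softmax over time, then take the weighted sum.
    attention = TimeDistributed(Dense(1, activation='tanh'))(sequence)
    attention = Flatten()(attention)
    attention = Activation('softmax')(attention)
    attention = RepeatVector(512)(attention)
    attention = Permute([2, 1])(attention)
    weighted = multiply([sequence, attention])   # replaces merge(..., mode='mul')
    return Lambda(lambda xin: K.sum(xin, axis=1))(weighted)

sequence_1_input = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')
sequence_2_input = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')

# Apply the shared embedding and LSTM to each input, then attend over time.
x1 = attend(lstm_layer(embedding_layer(sequence_1_input)))
y1 = attend(lstm_layer(embedding_layer(sequence_2_input)))

merged = concatenate([x1, y1])
merged = Dropout(0.2)(merged)
merged = BatchNormalization()(merged)
merged = Dense(256, activation='relu')(merged)
merged = Dropout(0.2)(merged)
merged = BatchNormalization()(merged)
probabilities = Dense(1, activation='sigmoid')(merged)

model = Model(inputs=[sequence_1_input, sequence_2_input], outputs=probabilities)
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()

Note that the attention layers inside attend are created fresh on each call, so the two branches do not share attention weights; sharing them would require instantiating the TimeDistributed, Dense and related layers once and reusing those objects in both branches.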

0 Answers:

No answers yet