Keras Attention Layer

Date: 2019-10-28 16:16:23

Tags: keras attention-model

I want to get the sequence of attention weights/scores from the attention layer in this kernel: https://www.kaggle.com/qqgeogor/keras-lstm-attention-glove840b-lb-0-043

so that I can feed them into another LSTM, for example:

from keras.layers import Input, Dropout, Dense, BatchNormalization

# embedding_layer, lstm_layer, Attention, and the hyperparameters below
# are defined as in the linked kernel.
comment_input = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')
embedded_sequences = embedding_layer(comment_input)
x = lstm_layer(embedded_sequences)
x = Dropout(rate_drop_dense)(x)
x = Attention(MAX_SEQUENCE_LENGTH)(x)
x = lstm_layer(x)  # second LSTM on top of the attention output
merged = Dense(num_dense, activation=act)(x)
merged = Dropout(rate_drop_dense)(merged)
merged = BatchNormalization()(merged)
preds = Dense(6, activation='sigmoid')(merged)

How can I get the attention weights for the whole sequence, rather than a single collapsed vector, so that I can also stack an LSTM after the attention layer?
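For reference, a minimal sketch of one way to do this, assuming TensorFlow-backed Keras 2.x and the kernel's scoring scheme (a learned per-feature weight plus a per-timestep bias, passed through tanh and a softmax over time). The idea is to keep the time axis: instead of summing the weighted inputs over time as the kernel's Attention layer does, return the weighted sequence itself, so the output shape matches the input and an LSTM can consume it. The class name SeqSelfAttention is hypothetical.

from keras import backend as K
from keras.layers import Layer, LSTM

class SeqSelfAttention(Layer):
    # Hypothetical variant of the kernel's Attention layer that keeps the
    # time axis: it returns the weighted inputs (batch, timesteps, features)
    # instead of their sum over time, so an LSTM can be stacked on top.
    def build(self, input_shape):
        # One scoring weight per feature and one bias per timestep,
        # mirroring the kernel's layer (assumption: timesteps are known).
        self.W = self.add_weight(name='att_weight',
                                 shape=(input_shape[-1], 1),
                                 initializer='glorot_uniform',
                                 trainable=True)
        self.b = self.add_weight(name='att_bias',
                                 shape=(input_shape[1], 1),
                                 initializer='zeros',
                                 trainable=True)
        super().build(input_shape)

    def call(self, x):
        # Unnormalized score per timestep: (batch, timesteps, 1).
        e = K.tanh(K.dot(x, self.W) + self.b)
        # Attention weights: softmax over the time axis.
        a = K.softmax(e, axis=1)
        # Weight each timestep but do NOT sum over time, so the
        # output is still a sequence with the same shape as the input.
        return x * a

    def compute_output_shape(self, input_shape):
        return input_shape

Used in the model above, it would look like:

x = lstm_layer(embedded_sequences)  # lstm_layer must use return_sequences=True
x = SeqSelfAttention()(x)           # shape unchanged: (batch, timesteps, features)
x = LSTM(64)(x)                     # a fresh LSTM can now run over the weighted sequence

Note that reusing the same lstm_layer instance on the attention output only works if the feature dimension matches the input it was originally built on; using a fresh LSTM layer after the attention avoids that constraint.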

0 Answers:

No answers yet.