How to use Bahdanau attention for time series forecasting?

Date: 2020-10-04 03:50:15

Tags: tensorflow keras deep-learning time-series lstm

Can Bahdanau attention be used to solve a multivariate time series forecasting problem? Using the Bahdanau implementation from here, I came up with the following code for time series forecasting.

    from tensorflow.keras.layers import Input, LSTM, Dense, Concatenate, Flatten
    from attention_keras import AttentionLayer
    from tensorflow.keras import Model

    num_inputs = 5   # number of features per time step
    seq_length = 10  # length of the input window
    inputs = Input(shape=(seq_length, num_inputs), name='inputs')
    lstm_out = LSTM(64, return_sequences=True)(inputs)
    lstm_out = LSTM(64, return_sequences=True)(lstm_out)

    # Attention layer: the same LSTM output is passed as both the encoder
    # and decoder sequence (self-attention over the input window)
    attn_layer = AttentionLayer(name='attention_layer')
    attn_out, attn_states = attn_layer([lstm_out, lstm_out])

    # Concatenate attention output and LSTM output; in the original
    # seq2seq code the second input was the decoder LSTM output
    concat_out = Concatenate(axis=-1, name='concat_layer')([lstm_out, attn_out])
    flat_out = Flatten()(concat_out)

    # Dense head producing a single forecast value
    dense_out = Dense(seq_length, activation='relu')(flat_out)
    predictions = Dense(1)(dense_out)

    # Full model
    full_model = Model(inputs=inputs, outputs=predictions)
    full_model.compile(optimizer='adam', loss='mse')
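
For reference, a minimal training sketch with dummy data; the shapes follow from the model above (inputs of shape `(batch, seq_length, num_inputs)`, one target value per sample), and the array names and sizes are hypothetical:

    import numpy as np

    n_samples = 256  # hypothetical dataset size
    X = np.random.rand(n_samples, seq_length, num_inputs).astype('float32')
    y = np.random.rand(n_samples, 1).astype('float32')

    full_model.fit(X, y, epochs=5, batch_size=32)
    preds = full_model.predict(X[:4])  # -> shape (4, 1)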

On my data, this model does perform better than a plain LSTM without attention, but I am not sure whether the implementation makes sense, since I feed the same LSTM output to the attention layer as both the encoder and decoder sequences.
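
As a sanity check, an equivalent sketch using Keras's built-in `AdditiveAttention` (which implements the same Bahdanau-style additive scoring) looks like this; passing the LSTM output as both query and value reproduces the self-attention wiring above, so comparing the two models should reveal whether the pattern itself is sound:

    from tensorflow.keras.layers import (Input, LSTM, Dense, Concatenate,
                                         Flatten, AdditiveAttention)
    from tensorflow.keras import Model

    inputs = Input(shape=(seq_length, num_inputs))
    lstm_out = LSTM(64, return_sequences=True)(inputs)

    # Bahdanau-style additive attention; query = value = lstm_out,
    # i.e. self-attention over the encoded input window
    attn_out = AdditiveAttention()([lstm_out, lstm_out])

    concat_out = Concatenate(axis=-1)([lstm_out, attn_out])
    flat_out = Flatten()(concat_out)
    dense_out = Dense(seq_length, activation='relu')(flat_out)
    predictions = Dense(1)(dense_out)

    alt_model = Model(inputs=inputs, outputs=predictions)
    alt_model.compile(optimizer='adam', loss='mse')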

0 Answers:

No answers yet