Adding a BatchNormalization layer to an LSTM encoder-decoder model

Date: 2019-04-07 11:07:25

Tags: python lstm batch-normalization

I would like to know how to add a BatchNormalization layer to an LSTM encoder-decoder model. I have code for an LSTM encoder-decoder model that performs time-series forecasting.

from keras.models import Model
from keras.layers import Input, LSTM, Dense
from keras.optimizers import Adam

num_features = X_train.shape[2]
# Define an input series and encode it with an LSTM.
encoder_inputs = Input(shape=(None, num_features))
encoder = LSTM(units_size, return_state=True, dropout=dropout)
encoder_outputs, state_h, state_c = encoder(encoder_inputs)

# We discard `encoder_outputs` and only keep the final states. These represent the "context"
# vector that we use as the basis for decoding.
encoder_states = [state_h, state_c]

# Set up the decoder, using `encoder_states` as initial state.
# This is where teacher forcing inputs are fed in.
decoder_inputs = Input(shape=(None, 1)) 

# We set up our decoder using `encoder_states` as initial state.  
# We return full output sequences and return internal states as well. 
# We don't use the return states in the training model, but we will use them in inference.
decoder_lstm = LSTM(units_size, return_sequences=True, return_state=True, dropout=dropout)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs,
                                     initial_state=encoder_states)

decoder_dense = Dense(1) # 1 continuous output at each timestep
decoder_outputs = decoder_dense(decoder_outputs)

# Define the model that will turn
# `encoder_input_data` & `decoder_input_data` into `decoder_target_data`
model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(Adam(lr=learning_rate), loss='mean_absolute_error')

I want to add a BatchNormalization layer to the decoder part, but I don't know where I should place it. Any help would be appreciated.

0 Answers:

There are no answers.