How do I add an attention layer to a CNN_BLSTM model using Keras?

Asked: 2021-05-11 12:56:12

Tags: keras deep-learning conv-neural-network lstm

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (TimeDistributed, Conv2D, MaxPooling2D,
                                     Flatten, Bidirectional, LSTM, Dense, Dropout)

def cnn_blsm():
    # CNN feature extractor, applied to every timestep via TimeDistributed
    model = Sequential()
    model.add(TimeDistributed(Conv2D(20, (3, 3), activation='tanh', padding='same'),
                              input_shape=(1, 11, 11, 1)))
    model.add(TimeDistributed(MaxPooling2D(pool_size=(2, 2))))
    model.add(TimeDistributed(Conv2D(40, (3, 3), activation='tanh', padding='same')))
    model.add(TimeDistributed(MaxPooling2D(pool_size=(2, 2))))
    model.add(TimeDistributed(Conv2D(60, (3, 3), activation='tanh', padding='same')))
    model.add(TimeDistributed(MaxPooling2D(pool_size=(2, 2))))
    model.add(TimeDistributed(Flatten()))

    # Bidirectional LSTM over the per-timestep CNN features
    model.add(Bidirectional(LSTM(40)))
#     model.add(Attention(use_scale=False))   # attempted attention layer
#     model.add(Dense(1, activation='relu'))

    # Classification head
    model.add(Dense(320, activation='relu'))
    model.add(Dropout(0.1))
#     model.add(Dense(1024, activation='tanh'))
    model.add(Dense(1, activation='sigmoid'))

    return model

I am trying to add an attention layer to my model, but every example I find is built around an embedding layer and an encoder-decoder mechanism. Is there a way to add attention without them?
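One way to do this without embeddings or an encoder-decoder is to apply self-attention over the BLSTM's per-timestep outputs. Below is a minimal sketch using the built-in tf.keras.layers.Attention layer; it switches to the Functional API because Attention takes a list of tensors as input. The function name cnn_blstm_attention and the GlobalAveragePooling1D reduction are illustrative choices, not the only option:

from tensorflow.keras.models import Model
from tensorflow.keras.layers import (Input, TimeDistributed, Conv2D, MaxPooling2D,
                                     Flatten, Bidirectional, LSTM, Attention,
                                     GlobalAveragePooling1D, Dense, Dropout)

def cnn_blstm_attention():  # illustrative name
    # Same CNN front end as the Sequential model above
    inp = Input(shape=(1, 11, 11, 1))
    x = TimeDistributed(Conv2D(20, (3, 3), activation='tanh', padding='same'))(inp)
    x = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(x)
    x = TimeDistributed(Conv2D(40, (3, 3), activation='tanh', padding='same'))(x)
    x = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(x)
    x = TimeDistributed(Conv2D(60, (3, 3), activation='tanh', padding='same'))(x)
    x = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(x)
    x = TimeDistributed(Flatten())(x)

    # return_sequences=True keeps one output vector per timestep,
    # which is what the attention layer attends over
    seq = Bidirectional(LSTM(40, return_sequences=True))(x)

    # Self-attention: the sequence is both query and value,
    # so no separate encoder/decoder states are needed
    att = Attention(use_scale=False)([seq, seq])

    # Collapse the attended sequence back to a single feature vector
    x = GlobalAveragePooling1D()(att)

    x = Dense(320, activation='relu')(x)
    x = Dropout(0.1)(x)
    out = Dense(1, activation='sigmoid')(x)
    return Model(inp, out)

The key changes are return_sequences=True on the LSTM, so attention sees one vector per timestep, and a pooling step to collapse the attended sequence back to a single vector; tf.keras.layers.AdditiveAttention (Bahdanau-style) can be swapped in the same way. Note that with input_shape=(1, 11, 11, 1) the sequence length is 1, so attention has only a single timestep to weight; it only becomes meaningful once that first dimension is longer than 1.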

0 Answers:

No answers yet.