Getting the output of a nested intermediate model wrapped by a TimeDistributed layer

Time: 2020-02-26 12:37:48

Tags: keras deep-learning neural-network

This question is similar to https://datascience.stackexchange.com/questions/19362/how-to-obtain-output-of-intermediate-model-in-keras

I am building a neural network architecture with the functional API, as follows:

from keras.layers import (Input, Dense, Activation, Lambda, multiply,
                          TimeDistributed, Conv1D, Flatten, Dropout)
from keras.models import Model
from keras import backend as K

# embed_dim and hidden_dims are hyperparameters defined earlier in my script

input1 = Input(shape=(50, embed_dim))

# attention weights over the 50 positions of one news item
dense_layer = Dense(embed_dim, activation='tanh')(input1)
softmax_layer = Activation('softmax', name='attention')(dense_layer)

# apply the attention weights and sum over the sequence axis
attention_mul = multiply([softmax_layer, input1])
vec_sum = Lambda(lambda x: K.sum(x, axis=1))(attention_mul)

# Nested (Intermediate Model)
pre_model1 = Model(input1, vec_sum, name='news_attention')

# outer model: apply the nested model to every time step of input2
input2 = Input(shape=(1, 50, embed_dim))
pre_cnn = TimeDistributed(pre_model1)(input2)
cnn = Conv1D(filters=100,
             kernel_size=3,
             padding='same',
             activation='relu')(pre_cnn)
flatten = Flatten()(cnn)
dropout = Dropout(0.5)(flatten)
final = Dense(hidden_dims, activation='relu')(dropout)

model = Model(input2, final)

Now I want to access the output of "softmax_layer", which is a layer inside the nested model pre_model1. I tried to follow the solution from the datascience.stackexchange post linked above, but I cannot get hold of the nested model object (pre_model1) from the model summary shown below.

model.summary()

[screenshot: model architecture]

How can I get hold of the nested model object that is wrapped by the TimeDistributed layer?
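
Roughly, what I would like to be able to do is something like the sketch below. This is only a sketch of my intent, not working code: it assumes the TimeDistributed wrapper exposes the wrapped model through a .layer attribute and that the wrapper is the second layer of model, and x stands for a batch of shape (batch, 50, embed_dim).

# sketch only -- this is what I have not managed to get working
td_layer = model.layers[1]            # the TimeDistributed wrapper (index guessed from the summary)
inner_model = td_layer.layer          # hopefully this is pre_model1 ('news_attention')

# intermediate model from the nested model's input to the softmax output
attention_model = Model(inputs=inner_model.input,
                        outputs=inner_model.get_layer('attention').output)

attention_weights = attention_model.predict(x)   # x: (batch, 50, embed_dim)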

Thanks.

0 Answers:

There are no answers yet.