Keras: Embedding-layer input error and corresponding input_length error

Posted: 2019-12-20 13:12:28

Tags: keras nlp encoder-decoder

I get an error on Input when Embedding is my first layer: even though I clearly specify the shape as (, 9), Keras reports that it cannot find a tensor matching the Input() shape. Can anyone help me?

The code is as follows:

# Imports missing from the original snippet
from keras.layers import (Input, Embedding, LSTM, Dense, Flatten,
                          Activation, RepeatVector, Permute,
                          TimeDistributed, dot)
from keras.models import Model
from keras.utils import plot_model

def model_3(src_vocab, tar_vocab, src_timesteps, tar_timesteps, n_units):
    _nput = Input(shape=(src_timesteps,), dtype='int32')
    embedding = Embedding(input_dim=src_vocab, output_dim=n_units,
                          input_length=src_timesteps, mask_zero=False)(_nput)
    activations = LSTM(n_units, return_sequences=True)(embedding)
    # Attention: one weight per source timestep, softmax-normalised
    attention = Dense(1, activation='tanh')(activations)
    attention = Flatten()(attention)
    attention = Activation('softmax')(attention)
    attention = RepeatVector(tar_timesteps)(attention)     # (batch, tar_timesteps, src_timesteps)
    activations = Permute([2, 1])(activations)             # (batch, n_units, src_timesteps)
    sent_representation = dot([attention, activations], axes=-1)
    sent_representation = LSTM(n_units, return_sequences=True)(sent_representation)
    sent_representation = TimeDistributed(Dense(tar_vocab, activation='softmax'))(sent_representation)
    # Fixed: 'sent' was undefined, and the deprecated input=/output=
    # keyword arguments are now inputs=/outputs=
    model = Model(inputs=_nput, outputs=sent_representation)
    model.compile(optimizer='adam', loss='categorical_crossentropy')
    print(model.summary())
    plot_model(model, to_file='model.png', show_shapes=True)
    return model
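For reference, the shape bookkeeping behind the RepeatVector/Permute/dot attention step can be checked with plain NumPy, independently of Keras. This is a sketch with hypothetical dimensions (src_timesteps=9 matching the question's shape, the rest made up); `keras.layers.dot(..., axes=-1)` contracts the last axis of both tensors, which is the batched einsum below:

```python
import numpy as np

# Hypothetical dimensions (only src_timesteps=9 comes from the question)
batch, src_t, tar_t, n_units = 2, 9, 7, 4

# After RepeatVector(tar_timesteps): (batch, tar_timesteps, src_timesteps)
attention = np.random.rand(batch, tar_t, src_t)
# After Permute([2, 1]):            (batch, n_units, src_timesteps)
activations = np.random.rand(batch, n_units, src_t)

# dot([attention, activations], axes=-1) contracts src_timesteps,
# i.e. a weighted sum of the encoder states for each target step
sent_representation = np.einsum('bts,bns->btn', attention, activations)

print(sent_representation.shape)  # (2, 7, 4) = (batch, tar_timesteps, n_units)
```

If the shapes printed here do not match what `model.summary()` reports, the mismatch is in the layers before the `dot`, not in the `Input` itself.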

0 Answers:

There are no answers yet.