ValueError: Input 0 is incompatible with layer flatten_1

Asked: 2019-04-08 15:35:25

Tags: python keras deep-learning lstm

I am trying to combine a CNN and an LSTM, and I am working on the following code (I did not follow any tutorial):

def create_model(sequence_len_med,vocabulary_size_description,vocabulary_size_med):

    embedding_weights1 = train_word2vec(x_train1, vocabulary_inv, num_features=embedding_dim,min_word_count=min_word_count, context=context)

    embedding_weights2 = train_word2vec(x_train2, vocab_inv, num_features=embedding_dim,min_word_count=min_word_count, context=context)

    input_shape = (sequence_len_med,)

    model_input = Input(shape=input_shape)
    layer = Embedding(len(vocabulary_inv), embedding_dim,input_length=sequence_len_med, name="embedding")(model_input)

    layer = Dropout(dropout_prob[0])(layer)

    #Input 1 : drugs
    model_input_med = Input(shape=input_shape)
    x1=Embedding(vocabulary_size_med,100)(model_input_med)
    conv1=Conv1D(filters=32, kernel_size=4, activation='relu')(x1)
    pool1=AveragePooling1D(pool_size=2)(conv1)
    lstm1=LSTM(100)(pool1)
    flat1=Flatten()(lstm1)

    #Input 2 : condition
    model_input_cond = Input(shape=input_shape)
    x2=Embedding(vocabulary_size_description,100,model_input_cond)
    conv2=Conv1D(filters=32, kernel_size=4, activation='relu')(x2)
    pool2=AveragePooling1D(pool_size=2)(conv2)
    lstm2=LSTM(100)(pool2)
    flat2=Flatten()(lstm2)

    #Merging
    merge=concatenate([flat1,flat2])
    dense=Dense(10,activation='relu')(merge)
    OUTPUT=Dense(1,activation='softmax')(dense)
    model=Model(input=[model_input_cond,model_input_med],outputs=OUTPUT)

    layer.set_weights(embedding_weights1)
    layer.set_weights(embedding_weights2)

    model.compile(loss='mse',optimizer='adam',metrics=['mae','acc'])
    return model


model=create_model(sequence_len_med,vocabulary_size_description,vocabulary_size_med)
model.fit([x_train1,x_train2],y,epochs=10,batch_size=16)

Here are the shapes:

model_input_med:TensorShape([Dimension(None), Dimension(57770)])

model_input_cond:TensorShape([Dimension(None), Dimension(57770)])
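For reference, the error comes from the shape flow in each branch: `LSTM(100)` (without `return_sequences=True`) emits a 2-D tensor of shape `(batch, 100)`, while `Flatten` expects an input of at least 3 dimensions, hence the complaint about `flatten_1`. Below is a minimal single-branch sketch, assuming small hypothetical sizes (`seq_len=50`, `vocab=1000`) rather than the question's real ones, that traces the shapes and simply drops the `Flatten`:

```python
# Minimal single-branch sketch (hypothetical sizes, not the asker's data),
# tracing shapes through Embedding -> Conv1D -> AveragePooling1D -> LSTM.
from tensorflow.keras.layers import (Input, Embedding, Conv1D,
                                     AveragePooling1D, LSTM, Dense)
from tensorflow.keras.models import Model

seq_len, vocab = 50, 1000                  # hypothetical sizes
inp = Input(shape=(seq_len,))
x = Embedding(vocab, 100)(inp)             # (batch, 50, 100)
x = Conv1D(32, 4, activation='relu')(x)    # (batch, 47, 32)
x = AveragePooling1D(2)(x)                 # (batch, 23, 32)
x = LSTM(100)(x)                           # (batch, 100) -- already 2-D
# A Flatten() here is what triggered the asker's ValueError in their
# Keras version: Flatten expects ndim >= 3, but LSTM returned ndim == 2.
out = Dense(1, activation='sigmoid')(x)    # sigmoid, not softmax, for 1 unit
model = Model(inputs=inp, outputs=out)
model.compile(loss='binary_crossentropy', optimizer='adam')
print(model.output_shape)
```

Keeping `Flatten` would require `LSTM(100, return_sequences=True)`, which returns the full `(batch, timesteps, 100)` sequence. Note also that a 1-unit `Dense` with `softmax` always outputs 1.0; `sigmoid` is the usual choice for a single-unit output.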

If any other information is needed, please let me know and I will provide it.

0 Answers
