Building Logistic Regression with Keras

Time: 2020-03-08 19:46:00

Tags: python-3.x tensorflow keras deep-learning word-embedding

I am trying to create word embeddings, but when I fit the data to the model (which is just a logistic regression model with an embedding layer), I get the following error:

Error when checking target: expected dense_29 to have 3 dimensions, but got array with shape (59568180, 1)

I don't understand why the last Dense layer expects a 3-dimensional output when all I want to predict is a probability.

The input array X has shape (59568180, 1, 2), while the output Y has shape (59568180,).

Here is my code:

# imports assumed for this snippet (tensorflow.keras; standalone keras works the same)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, Embedding, Dense

def BuildModel(vocab_size, emb_size, window_size):
    model = Sequential([
        Flatten(input_shape=(1, 2)),
        Embedding(output_dim=emb_size, input_dim=vocab_size),
        Dense(1, input_shape=(2,))])
    return model

def TrainModel(X_train, Y_train, vocab_size, emb_size=300, window_size=3, epochs=1, optimizer='adam'):
    model = BuildModel(vocab_size, emb_size, window_size)
    model.compile(optimizer=optimizer,
                  loss='binary_crossentropy',
                  metrics=['accuracy'])
    model.fit(X_train, Y_train, epochs=epochs)

TrainModel(X_train, Y_train, len(word2index_vocab))
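
For reference, here is a minimal sketch of the same layer stack (assuming tensorflow.keras, with small made-up vocab and embedding sizes in place of the real ones); printing its output shape shows the 3-D output that the error message seems to be comparing the target against:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, Embedding, Dense

# Toy sizes stand in for vocab_size and emb_size; the layer order mirrors BuildModel above.
probe = Sequential([
    Flatten(input_shape=(1, 2)),              # (batch, 1, 2) -> (batch, 2)
    Embedding(input_dim=1000, output_dim=8),  # (batch, 2)    -> (batch, 2, 8)
    Dense(1)])                                # (batch, 2, 8) -> (batch, 2, 1)

print(probe.output_shape)  # (None, 2, 1), i.e. a 3-D output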

0 Answers:

No answers yet.