Adding multiple hidden layers in Keras

Date: 2018-08-21 09:40:36

Tags: python-3.x keras nlp deep-learning sentiment-analysis

I have a simple sentiment analyser built with Keras. My code is based on the Keras example on GitHub: https://github.com/keras-team/keras/blob/master/examples/imdb_lstm.py

The original, working model is:

from __future__ import print_function

from keras.preprocessing import sequence
from keras.models import Sequential
from keras.layers import Dense, Embedding, Activation
from keras.layers import GRU, LeakyReLU
from keras.datasets import imdb

max_features = 2000
maxlen = 80  # cut texts after this number of words (among top max_features most common words)
batch_size = 256
hidden_layer_size = 32
dropout = 0.2
num_epochs = 1
activation_func = LeakyReLU(alpha=0.5)

print('Loading data...')
(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_features)
print(len(x_train), 'train sequences')
print(len(x_test), 'test sequences')

print('Pad sequences (samples x time)')
x_train = sequence.pad_sequences(x_train, maxlen=maxlen)
x_test = sequence.pad_sequences(x_test, maxlen=maxlen)
print('x_train shape:', x_train.shape)
print('x_test shape:', x_test.shape)

print('Build model...')
model = Sequential()
model.add(Embedding(max_features, hidden_layer_size))
model.add(GRU(hidden_layer_size, dropout=dropout, recurrent_dropout=dropout))
model.add(Activation(activation_func))
model.add(Dense(1, activation='sigmoid'))

# try using different optimizers and different optimizer configs
model.compile(loss='binary_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])

print('Train...')
model.fit(x_train, y_train,
          batch_size=batch_size,
          epochs=num_epochs,
          validation_data=(x_test, y_test))
score, acc = model.evaluate(x_test, y_test,
                            batch_size=batch_size)
print('Test score:', score)
print('Test accuracy:', acc)

The error I get is:

ValueError: Input 0 is incompatible with layer gru_2: expected ndim=3, found ndim=2

This happens whenever I try to add a second hidden layer to the model, for example:

model = Sequential()
model.add(Embedding(max_features, hidden_layer_size))
model.add(GRU(hidden_layer_size, dropout=dropout, recurrent_dropout=dropout))
model.add(Activation(activation_func))
model.add(GRU(hidden_layer_size, dropout=dropout, recurrent_dropout=dropout))
model.add(Activation(activation_func))
model.add(Dense(1, activation='sigmoid'))

I believe I'm missing something about the hidden layer dimensions. How should I proceed to successfully add another hidden layer?

Thanks in advance

1 answer:

Answer 0 (score: 6)

This is because, by default, RNN layers in Keras return only their last output, i.e. an input of shape (samples, time_steps, features) becomes (samples, hidden_layer_size). To chain multiple RNNs, you need to set the intermediate RNN layers to return_sequences=True:

model = Sequential()
model.add(Embedding(max_features, hidden_layer_size))
# Add return_sequences=True
model.add(GRU(hidden_layer_size, activation=activation_func, dropout=dropout, recurrent_dropout=dropout, return_sequences=True))
# (samples, time_steps, hidden_layer_size)
model.add(GRU(hidden_layer_size, activation=activation_func, dropout=dropout, recurrent_dropout=dropout))
# (samples, hidden_layer_size)
model.add(Dense(1, activation='sigmoid'))

You can also return the last hidden state, among other things; have a look at the documentation to see what these arguments do.
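To make the shape semantics concrete, here is a toy, shape-only sketch in plain NumPy. It does not implement the real GRU gating math; `minimal_gru_shapes` is a made-up helper purely to illustrate what `return_sequences` changes about the output shape:

import numpy as np

def minimal_gru_shapes(x, units, return_sequences=False):
    # x has shape (samples, time_steps, features), like the output of Embedding.
    samples, time_steps, features = x.shape
    h = np.zeros((samples, units))  # hidden state, one vector per sample
    outputs = []
    for t in range(time_steps):
        # Trivial stand-in update (NOT real GRU gating): mix the old state
        # with the mean of the current timestep's features.
        h = 0.5 * h + 0.5 * x[:, t, :].mean(axis=1, keepdims=True)
        outputs.append(h)
    if return_sequences:
        # One output per timestep: (samples, time_steps, units) -- ndim=3,
        # which is what the next recurrent layer expects as input.
        return np.stack(outputs, axis=1)
    # Only the last output: (samples, units) -- ndim=2, hence the
    # "expected ndim=3, found ndim=2" error when a GRU follows it.
    return h

x = np.random.rand(4, 80, 32)
print(minimal_gru_shapes(x, 32, return_sequences=True).shape)   # (4, 80, 32)
print(minimal_gru_shapes(x, 32, return_sequences=False).shape)  # (4, 32)

The second GRU in the failing model receives the 2-D output of the first, which is exactly the mismatch the traceback reports.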