I am trying to build an RNN for character recognition and prediction, using a book as input. Each epoch takes several minutes on my local machine, so I tried to run it on GCP.
When I execute the code on Google Cloud Platform, I get the error below. The same code runs fine on my local machine using Spyder3.
# Character Prediction using RNN
# Small LSTM Network to Generate Text for Alice in Wonderland
import numpy
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Dropout
from keras.layers import LSTM
from keras.callbacks import ModelCheckpoint
from keras.utils import np_utils
# load ascii text and convert to lowercase
filename = "Alice in Wonderland.txt"
raw_text = open(filename).read()
raw_text = raw_text.lower()
# create mapping of unique chars to integers
chars = sorted(list(set(raw_text)))
char_to_int = dict((c, i) for i, c in enumerate(chars))
# summarize the loaded data
n_chars = len(raw_text)
n_vocab = len(chars)
print("Total Characters: ", n_chars)
print("Total Vocab: ", n_vocab)
# prepare the dataset of input to output pairs encoded as integers
seq_length = 100
X_train = []
y_train = []
for i in range(0, n_chars - seq_length, 1):
    seq_in = raw_text[i:i + seq_length]
    seq_out = raw_text[i + seq_length]
    X_train.append([char_to_int[char] for char in seq_in])
    y_train.append(char_to_int[seq_out])
n_patterns = len(X_train)
print("Total Patterns: ", n_patterns)
# reshape X to be [samples, time steps, features]
X = numpy.reshape(X_train, (len(X_train), seq_length, 1))
# normalize
X = X / float(n_vocab)
# one hot encode the output variable
y = np_utils.to_categorical(y_train)
# define the LSTM model
model = Sequential()
model.add(LSTM(256, input_shape=(X.shape[1], X.shape[2])))
model.add(Dropout(0.2))
model.add(Dense(y.shape[1], activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')
# define the checkpoint
filepath="weights-improvement-{epoch:02d}-{loss:.4f}.hdf5"
checkpoint = ModelCheckpoint(filepath, monitor='loss', verbose=1, save_best_only=True, mode='min')
callbacks_list = [checkpoint]
# fit the model
model.fit(X, y, epochs=20, batch_size=128, callbacks=callbacks_list)
The error occurs on the following line, when the LSTM is created:
model.add(LSTM(256, input_shape=(X.shape[1], X.shape[2])))
This is the error:
File "/root/anaconda3/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py", line 2957, in rnn
    maximum_iterations=input_length)
TypeError: while_loop() got an unexpected keyword argument 'maximum_iterations'
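For context, this TypeError is a known Keras/TensorFlow version mismatch: newer Keras backends pass maximum_iterations to tf.while_loop, an argument that older TensorFlow builds do not accept. A minimal sketch of that rule (versions_compatible is a hypothetical helper, and the exact version cutoffs, Keras 2.1.3 and TensorFlow 1.5.0, are assumptions worth checking against the release notes):

```python
# Sketch of the version rule behind the TypeError (cutoffs are assumptions):
# Keras >= 2.1.3 passes maximum_iterations to tf.while_loop, which
# TensorFlow only accepts from 1.5.0 onward.
def parse_version(version_string):
    # "2.1.6" -> (2, 1, 6); keeps at most three numeric components
    return tuple(int(part) for part in version_string.split(".")[:3])

def versions_compatible(keras_version, tf_version):
    # Hypothetical helper: True when this TypeError should not occur
    if parse_version(keras_version) >= (2, 1, 3):
        return parse_version(tf_version) >= (1, 5, 0)
    return True

print(versions_compatible("2.1.6", "1.4.1"))  # False: a failing pairing
print(versions_compatible("2.1.2", "1.4.1"))  # True: a working pairing
```

Under this reading, either downgrading Keras or upgrading TensorFlow restores a compatible pair, which is what the answers propose.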
Answer 0 (score: 2)
I ran into a similar problem when running on my local machine. These are the steps I followed.
My conda environment is named TESTENV.
Log in to (activate) your conda environment with
source activate TESTENV
Check whether pip is already installed in the conda environment; if not, install it:
conda install pip
Install TensorFlow version 1.4.1:
pip install --ignore-installed --upgrade https://storage.googleapis.com/tensorflow/mac/cpu/tensorflow-1.4.1-py2-none-any.whl
Install Keras version 2.1.2:
conda install keras=2.1.2
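After the installs above, it is worth confirming that the pinned versions actually took effect in the environment. A small sketch of such a check (installed_version is a helper written for this example; it falls back gracefully when a package is missing or exposes no version):

```python
# Report the installed version of each package, or note its absence.
def installed_version(module_name):
    try:
        module = __import__(module_name)
    except ImportError:
        return None  # package not installed in this environment
    return getattr(module, "__version__", "unknown")

for name in ("tensorflow", "keras"):
    version = installed_version(name)
    if version is None:
        print(name, "is not installed in this environment")
    else:
        print(name, version)
```

With the pairing from this answer, the expected output would show tensorflow 1.4.1 and keras 2.1.2.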
Answer 1 (score: 0)
conda remove keras*
conda install keras-gpu==2.1.6
This solved the same problem for me.
Answer 2 (score: 0)
Try upgrading your TensorFlow and the problem should be resolved.
CPU: pip install --upgrade tensorflow
GPU: pip install --upgrade tensorflow-gpu