No matter the number of neurons or the batch size, I keep getting a MemoryError from Keras's model.fit() with no further explanation. Does anyone know what this error refers to, or how to resolve it?
Error:
Using TensorFlow backend.
Traceback (most recent call last):
File "C:/Users/ideapad/Dropbox/TA/preprocessTA/kerass.py", line 32, in <module>
model.fit(np.array(fd.dataTrain), np.array(fd.outputTrain), batch_size=batch_size, epochs=100, verbose=1)
MemoryError
Code:
import fetchData as fd  # the user's own data-loading module
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation, LSTM, Dropout, Embedding

# Hyperparameters (num_input, num_output, neuron_1, dropout,
# activation_function, loss_function, optimizer_function, batch_size)
# are defined elsewhere in the original script.
model = Sequential()
model.add(Embedding(num_input, num_output))
model.add(LSTM(neuron_1))
model.add(Dropout(dropout))
model.add(Dense(num_output))
model.add(Activation(activation_function))
model.compile(loss=loss_function, optimizer=optimizer_function, metrics=['mae'])
model.fit(np.array(fd.dataTrain), np.array(fd.outputTrain), batch_size=batch_size, epochs=100, verbose=1)
score = model.evaluate(np.array(fd.dataTest), np.array(fd.outputTest))
print(score)
Answer 0 (score: 0):
It turned out to be a numpy issue; converting the inputs with np.asarray instead of np.array resolved it. Thanks @MatiasValdnegro and @Idavid.
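For reference, a minimal sketch of the fix (assuming fd.dataTrain and the other inputs are already numpy arrays): np.array copies its input by default, which can roughly double peak memory for a large dataset, while np.asarray returns the same object unchanged when it is already an ndarray of a matching dtype.

# np.array(x) copies by default; np.asarray(x) avoids the copy
# when x is already an ndarray.
model.fit(np.asarray(fd.dataTrain), np.asarray(fd.outputTrain),
          batch_size=batch_size, epochs=100, verbose=1)
score = model.evaluate(np.asarray(fd.dataTest), np.asarray(fd.outputTest))

A quick way to see the difference:

import numpy as np
a = np.zeros((1000, 1000), dtype=np.float32)
print(np.array(a) is a)    # False: a new copy was allocated
print(np.asarray(a) is a)  # True: the original array is reused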