TensorFlow model loading problem

Date: 2018-10-23 20:25:38

Tags: python-3.x tensorflow machine-learning

I am learning TensorFlow and trying to save and load a model. The model is in the file below:

import h5py
import tensorflow as tf
from tensorflow import keras  # assuming keras refers to tf.keras here

model = keras.Sequential()
model.add(keras.layers.Dense(785, activation='sigmoid'))
model.add(keras.layers.Dense(25, activation='sigmoid'))
model.add(keras.layers.Dense(10, activation='sigmoid'))

model.compile(optimizer=tf.train.GradientDescentOptimizer(0.01),
              loss='mse',
              metrics=['mae'])

# X, Y and the history callback are defined elsewhere in the script.
model.fit(X, Y, epochs=20, callbacks=[history])

f = h5py.File(r'C:\Users\akash\Desktop\Big Data\Model\model1', "w")
tf.keras.models.save_model(
    model,
    f,
    overwrite=True,
    include_optimizer=True
)
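
As an aside, tf.keras.models.save_model also accepts a plain file path instead of an open h5py.File handle. A minimal sketch of that variant; the .h5 extension on the path is an assumption, not part of the original code:

# Passing a path string directly; the .h5 extension here is assumed.
tf.keras.models.save_model(
    model,
    r'C:\Users\akash\Desktop\Big Data\Model\model1.h5',
    overwrite=True,
    include_optimizer=True
)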

My loading script is as follows:

import tensorflow as tf

model1 = tf.keras.models.load_model(
    r'C:\Users\akash\Desktop\Big Data\Model\model1',
    custom_objects=None,
    compile=True
)
model1.compile(optimizer=tf.train.GradientDescentOptimizer(0.01),
               loss='mse',
               metrics=['mae'])

I had to compile my model again because TensorFlow requires you to do so and does not allow the optimizer to be saved.

And because of this I get the following error:

Using TensorFlow backend.
WARNING:tensorflow:No training configuration found in save file: the model was *not* compiled. Compile it manually.
Traceback (most recent call last):
  File "C:/Users/akash/Desktop/Big Data/scripts/load_model.py", line 21, in <module>
    metrics=['mae'])
  File "C:\Python\lib\site-packages\tensorflow\python\training\checkpointable\base.py", line 426, in _method_wrapper
    method(self, *args, **kwargs)
  File "C:\Python\lib\site-packages\tensorflow\python\keras\engine\training.py", line 525, in compile
    metrics, self.output_names)
AttributeError: 'Sequential' object has no attribute 'output_names'
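
One thing worth checking, offered as an assumption rather than a confirmed fix: in some TensorFlow 1.x releases, a Sequential model saved without an explicit input shape may not be fully built when it is loaded back, and compiling the loaded model can then fail with errors like the output_names one above. A minimal sketch of the model definition with an explicit input_shape, where num_features (the number of columns in X) is a placeholder to fill in:

num_features = X.shape[1]  # placeholder: the number of input columns in your data

model = keras.Sequential()
model.add(keras.layers.Dense(785, activation='sigmoid', input_shape=(num_features,)))
model.add(keras.layers.Dense(25, activation='sigmoid'))
model.add(keras.layers.Dense(10, activation='sigmoid'))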

1 Answer:

Answer 0 (score: 0)

Maybe this can help you:

# MLP for Pima Indians Dataset Serialize to JSON and HDF5
from keras.models import Sequential
from keras.layers import Dense
from keras.models import model_from_json
import numpy
import os
# fix random seed for reproducibility
numpy.random.seed(7)
# load pima indians dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
# create model
model = Sequential()
model.add(Dense(12, input_dim=8, kernel_initializer='uniform', activation='relu'))
model.add(Dense(8, kernel_initializer='uniform', activation='relu'))
model.add(Dense(1, kernel_initializer='uniform', activation='sigmoid'))
# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, epochs=150, batch_size=10, verbose=0)
# evaluate the model
scores = model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

# serialize model to JSON
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

# later...

# load json and create model
json_file = open('model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

# evaluate loaded model on test data
loaded_model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
score = loaded_model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (loaded_model.metrics_names[1], score[1]*100))