How do I save a Keras regressor model from Colab to Google Drive?

Date: 2020-10-15 22:29:29

Tags: python tensorflow keras google-colaboratory

I want to save a regression model to Google Drive after training it. I recently started exploring machine learning and find Colab useful, but I am unable to upload the model. I have the Python code on my local machine, but I am not sure how to upload the model to Google Drive.

Edit: I tried this

from google.colab import drive
drive.mount('/content/gdrive')

name  = 'dll_28'
addr = '/content/drive/My Drive/learning/'

but when I run the code, it fails with the following error

"Transport endpoint is not connected: '/content/drive/My Drive/learning/model_dll_28.json'"
from sklearn.preprocessing import MinMaxScaler, StandardScaler

sc_X = MinMaxScaler(feature_range=(-1, 1))
sc_y = StandardScaler()

sX_train = sc_X.fit_transform(X_train)
sy_train = sc_y.fit_transform(y_train)
sX_test  = sc_X.transform(X_test)
sy_test  = sc_y.transform(y_test)




#==================================================================
#********************  Learning  **********************************
#==================================================================

# Importing the Keras libraries and packages
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras import losses
from tensorflow.keras import optimizers
import tensorflow.keras.initializers as init


# Initialising the ANN
reg = Sequential()
reg.add(Dense(units = 32, kernel_initializer = init.glorot_uniform(), activation = 'relu', input_dim = 6))
reg.add(Dense(units = 64, kernel_initializer = init.glorot_uniform(), activation = 'relu'))
reg.add(Dense(units = 128, kernel_initializer = init.glorot_uniform(), activation = 'relu'))
reg.add(Dense(units = 32, kernel_initializer = init.glorot_uniform(), activation = 'relu'))
reg.add(Dense(units = 2, kernel_initializer = init.glorot_uniform(), activation = 'linear'))

# Compiling the ANN
reg.compile(optimizer = optimizers.Adam(learning_rate=0.001), loss = losses.mse)


# Fitting the ANN to the Training set
reg.fit(sX_train, sy_train, batch_size = 256, epochs = 500)

score = reg.evaluate(sX_test, sy_test, batch_size = 250)
print(score)
#plt.hist(y[:,2]);

#==================================================================
#********************  Saving the regressor  **********************
#==================================================================

import pickle
name  = 'dll_28'
addr = homefolder + '/reg_files/dll_28/'

reg_json=reg.to_json()
with open(addr+'model_'+name+'.json', "w") as json_file:
    json_file.write(reg_json)
reg.save_weights(addr+'reg_'+name+'.h5')  

import joblib  # sklearn.externals.joblib was removed in recent scikit-learn; import joblib directly
joblib.dump(sc_X, addr+'scX_'+name+'.pkl') 
joblib.dump(sc_y, addr+'scY_'+name+'.pkl')
pickle.dump( reg.get_weights(), open( addr+'w8_'+name+'.p', "wb" ) )
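For completeness, the pieces saved above can be restored later with `model_from_json`, `load_weights`, and `joblib.load`. A minimal self-contained sketch of the round trip, using a tiny stand-in model and a temporary folder in place of `addr` (the layer sizes here are illustrative, not the ones from the question):

```python
import tempfile
import numpy as np
import joblib
from sklearn.preprocessing import MinMaxScaler
from tensorflow.keras.models import Sequential, model_from_json
from tensorflow.keras.layers import Dense

addr = tempfile.mkdtemp() + '/'
name = 'dll_28'

# Build and save a small stand-in model the same way as above
reg = Sequential([Dense(4, activation='relu', input_dim=6), Dense(2)])
with open(addr + 'model_' + name + '.json', 'w') as f:
    f.write(reg.to_json())
reg.save_weights(addr + 'reg_' + name + '.h5')

sc_X = MinMaxScaler(feature_range=(-1, 1)).fit(np.random.rand(10, 6))
joblib.dump(sc_X, addr + 'scX_' + name + '.pkl')

# Load everything back: architecture from JSON, then weights and scaler
with open(addr + 'model_' + name + '.json') as f:
    reg2 = model_from_json(f.read())
reg2.load_weights(addr + 'reg_' + name + '.h5')
sc_X2 = joblib.load(addr + 'scX_' + name + '.pkl')

# The restored model reproduces the original's predictions
x = sc_X2.transform(np.random.rand(3, 6))
assert np.allclose(reg2.predict(x), reg.predict(x))
```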

1 answer:

Answer 0: (score: 0)

Instead of Google Drive, you can use Google Storage. To interact with Google Storage, you can use the TensorFlow IO utilities, for example tf.io.gfile.GFile.

To set up access credentials for your Google Cloud account in the notebook, you can use the following logic:

from google.colab import auth

auth.authenticate_user()

By default, tf.keras.Model does not support reading from or writing to gs:// paths directly, so there are two options. You can save the model locally and then copy it to Google Storage by reading it back and writing it out with tf.io.gfile.GFile. Alternatively, you can apply the fix implemented here in the keras library, which is not yet available in the tensorflow release.
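A sketch of the "save locally, then copy" option. The gs:// bucket name below is a placeholder for your own; since tf.io.gfile handles local paths through the same API, the copy call is demonstrated here between local files:

```python
import os
import tempfile
import tensorflow as tf

tmp = tempfile.mkdtemp()
local_path = os.path.join(tmp, 'reg_dll_28.h5')
# gcs_path = 'gs://your-bucket/reg_files/reg_dll_28.h5'  # placeholder target

# Stand-in for reg.save_weights(local_path)
with tf.io.gfile.GFile(local_path, 'w') as f:
    f.write('weights placeholder')

# With credentials set up, the same call would copy to Google Storage:
# tf.io.gfile.copy(local_path, gcs_path, overwrite=True)
dest_path = os.path.join(tmp, 'copy_reg_dll_28.h5')
tf.io.gfile.copy(local_path, dest_path, overwrite=True)
print(tf.io.gfile.exists(dest_path))  # True
```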
