I created a pickle file from my machine learning model and saved it locally. I want to push it to Azure Blob Storage and retrieve it later. How can I do this with Python 3? Please help.
'''
# model
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
import pickle

X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.3, random_state=100)
regressor = LinearRegression()
regressor.fit(X_train, Y_train)

# Creating the pickle file finalized_model.sav locally
mypickle = 'finalized_model.sav'
pickle.dump(regressor, open(mypickle, 'wb'))
'''
I tried the following for importing a .csv from Azure and pushing one back, but I don't know how to do the same with pickle files.
'''
import pandas as pd
from io import BytesIO
from azure.storage.blob import BlockBlobService  # legacy azure-storage SDK

with BytesIO() as input_blob:
    block_blob_service = BlockBlobService(account_name='*****',
                                          account_key='*********************************************************************')
    block_blob_service.get_blob_to_stream('blobcontainer', 'claims.csv', input_blob)
    input_blob.seek(0)
    dataframe_blobdata = pd.read_csv(input_blob)
    # transforming the data in between
    output = dataframe_blobdata.to_csv(index_label="idx", encoding="utf-8")
    block_blob_service.create_blob_from_text('secondforblobcontainer', 'OutFilePy.csv', output)
'''
Answer 0 (score: 0)
As I understand it, you just want to upload the pickle file named finalized_model.sav to Azure Storage.
I suggest you use the azure-storage-blob SDK to upload the blob. Here is an official sample: Code examples.
In detail: first get your storage account's connection string from the portal, then use it to create a BlobServiceClient.
You can then create a container via blob_service_client.create_container(container_name).
Finally, create a blob client and upload the local file by its path.
Downloading it back from Azure Storage is also easy. An example of downloading blobs:
with open(download_file_path, "wb") as download_file:
    download_file.write(blob_client.download_blob().readall())