Is there a way to batch-predict a large amount of data (roughly 100 MB) with the Google Prediction API? I see that a model can be trained on data stored in Google Cloud Storage. Is it also possible to run predictions against a file stored in Google Cloud Storage?
Answer 0 (score: 0)
Yes, this works. Below is a Python code example.
import httplib2
from googleapiclient.discovery import build
from oauth2client.contrib.gce import AppAssertionCredentials

# Get predefined credentials (see https://cloud.google.com/sdk/gcloud/#gcloud.auth)
http = AppAssertionCredentials('https://www.googleapis.com/auth/prediction').authorize(httplib2.Http())
# Create a service for the Prediction API
service = build('prediction', 'v1.6', http=http)
# Define the request body for a new model, pointing at the CSV file stored in Google Cloud Storage
request_body = {"id": "<ModelID>", "storageDataLocation": "<bucket>/<filename>"}
# Execute the command to start training
response = service.trainedmodels().insert(project='<ProjectID>', body=request_body).execute()
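Note that the snippet above only trains the model from Cloud Storage. For the prediction side of the question: in Prediction API v1.6, `trainedmodels().predict()` takes a single `csvInstance` per request, so there is no built-in batch call that reads input rows from a Cloud Storage file; a common workaround is to loop over your rows client-side. The sketch below shows this, assuming the same `service`, `<ProjectID>` and `<ModelID>` as above; `make_predict_body` is a hypothetical helper introduced here for illustration.

```python
def make_predict_body(row):
    """Build the JSON body for a single v1.6 prediction request.

    `row` is one input record as a list of CSV field values,
    in the same column order used to train the model.
    """
    return {"input": {"csvInstance": list(row)}}

# Example usage (hypothetical loop over your own input rows):
# for row in my_rows:
#     result = service.trainedmodels().predict(
#         project='<ProjectID>', id='<ModelID>',
#         body=make_predict_body(row)).execute()
#     print(result.get('outputValue') or result.get('outputLabel'))
```

For ~100 MB of input this means many individual requests, so you would also need to respect the API's rate limits (e.g. by throttling or batching HTTP requests).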