I want to move my ETL scripts (which I used to run on my local machine) to Google Cloud using Cloud Functions + Cloud Scheduler. The scripts move data to and from Google Analytics (as one example). The problem is the location of the key file (.p12): I would like to put it in Google Cloud Storage and point the script at it. Currently KEY_FILE_LOCATION = r'c:/local_path/file.p12'. Connecting to Google Analytics:
I use:
import httplib2
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient.discovery import build

def initialize_analyticsreporting():
    credentials = ServiceAccountCredentials.from_p12_keyfile(
        SERVICE_ACCOUNT_EMAIL, KEY_FILE_LOCATION, scopes=SCOPES)
    http = credentials.authorize(httplib2.Http())
    analytics = build('analytics', 'v4', http=http,
                      discoveryServiceUrl=DISCOVERY_URI)
    return analytics
But I don't understand how to do this correctly.
Answer 0: (score: 0)
Note that the Cloud Functions client libraries use Application Default Credentials, and at runtime automatically obtain the built-in service account credentials from the Cloud Functions host. By default, the client authenticates as the YOUR_PROJECT_ID@appspot.gserviceaccount.com service account.
So if you can grant this service account the appropriate permissions on the resources it needs to access, the recommended approach is to rely on it.
There are examples of how to upload a file to a bucket in Python, and how to download a file from a bucket in Python:
from google.cloud import storage

def download_blob(bucket_name, source_blob_name, destination_file_name):
    """Downloads a blob from the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name)
    print('Blob {} downloaded to {}.'.format(
        source_blob_name, destination_file_name))
Then create credentials from the downloaded file, as described here:
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient.discovery import build

def get_ga_service(bucket):
    # Fetch the key file from the bucket using the download helper above.
    download_blob(bucket, 'auth.json', '/tmp/auth.json')
    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        '/tmp/auth.json',
        scopes=['https://www.googleapis.com/auth/analytics',
                'https://www.googleapis.com/auth/analytics.edit'])
    # Build the service object.
    return build('analytics', 'v3', credentials=credentials,
                 cache_discovery=False)
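The helper above could then be wired into an HTTP-triggered Cloud Function entry point that Cloud Scheduler invokes on a schedule. A rough sketch, where `BUCKET` and the function name are placeholders, not values from the question:

```python
BUCKET = 'my-etl-config-bucket'  # assumption: your bucket name

def run_etl(request):
    # get_ga_service is the helper defined above.
    analytics = get_ga_service(BUCKET)
    # ... run your report queries / ETL steps here ...
    return 'OK', 200
```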