TypeError: expected str, bytes or os.PathLike object, not NoneType when uploading a file to BQ

Asked: 2019-02-26 00:43:36

Tags: python google-bigquery gcloud

I'm not sure what the problem is here. Is there an issue with my credentials? I'm trying to insert data from GCP into Google BigQuery. Here is the full error:

Traceback (most recent call last):
  File "target.py", line 98, in <module>
    main()
  File "target.py", line 94, in main
    insert_data(gcs_file)
  File "target.py", line 85, in insert_data
    bq = BigQueryClient(project)
  File "/Users/xxx/Prog/emr-etl/xx_auth.py", line 58, in BigQueryClient
    credentials = Credentials.from_service_account_file(os.getenv('GOOGLE_APPLICATION_CREDENTIALS'))
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/oauth2/service_account.py", line 209, in from_service_account_file
    filename, require=['client_email', 'token_uri'])
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/auth/_service_account_info.py", line 71, in from_filename
    with io.open(filename, 'r', encoding='utf-8') as json_file:
TypeError: expected str, bytes or os.PathLike object, not NoneType
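
From the traceback it looks like os.getenv('GOOGLE_APPLICATION_CREDENTIALS') is returning None inside BigQueryClient, i.e. the environment variable isn't visible to the process that runs the script. A minimal check I could add before building the credentials would be something like this (the error message is just illustrative):

import os
from google.oauth2.service_account import Credentials

def load_credentials():
    # GOOGLE_APPLICATION_CREDENTIALS should hold the path to the service-account JSON key
    key_path = os.getenv('GOOGLE_APPLICATION_CREDENTIALS')
    if key_path is None:
        # None is exactly what triggers the "expected str, bytes or os.PathLike" TypeError
        raise RuntimeError('GOOGLE_APPLICATION_CREDENTIALS is not set for this process')
    return Credentials.from_service_account_file(key_path)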

Here is the code:

def upload_files(files, gcs_bucket="tracker"):

    storage_client = storage.Client(project='xxx-main')
    bucket = storage_client.get_bucket("tracker")

    for file in files:
        destination_filepath = file['folder'] + '/' + file['filename']
        source_filepath = file['local_filename']
        gcs_file = bucket.blob(destination_filepath)
        gcs_file.upload_from_filename(source_filepath)
    return gcs_file


def insert_data(gcs_file, project="xxx-main"):
    bq = BigQueryClient(project)
    bq_job_config = QueryJobConfig()
    job = bq.load_table_from_uri(gcs_file, 'snowplow', job_config=bq_job_config)
    result = job.result()


def main():
    lists = list_download(sp_bucket)
    gcs_file = upload_files(lists)
    insert_data(gcs_file)


if __name__ == "__main__":
    main()
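
For context, my understanding of google-cloud-bigquery is that load_table_from_uri wants a "gs://..." URI string plus a dataset-qualified table name rather than the Blob object that upload_files returns, and that a load job uses LoadJobConfig rather than QueryJobConfig. A rough sketch of what insert_data might look like once the credentials issue is sorted out (the dataset name my_dataset and the file format are assumptions on my part):

from google.cloud import bigquery

def insert_data_from_uri(gcs_uri, project="xxx-main"):
    # gcs_uri is expected to look like "gs://tracker/<folder>/<filename>"
    bq = bigquery.Client(project=project)
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,  # assumed file format
        autodetect=True,
    )
    # "my_dataset" is a placeholder; the client fills in the default project
    job = bq.load_table_from_uri(gcs_uri, 'my_dataset.snowplow', job_config=job_config)
    return job.result()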

0 Answers