BigQuery Python API CSV load error

Asked: 2017-05-04 14:29:13

Tags: python google-bigquery

I am using the Python module google-cloud-bigquery == 0.24.0 and I want to load some tab-delimited CSV files into BigQuery. Unfortunately I get the error below. What am I missing? I have already set the source_format and field_delimeter properties in the function. Please help.

def load_data_from_gcs(dataset_name, table_name, source):
    bigquery_client = bigquery.Client()
    dataset = bigquery_client.dataset(dataset_name)
    table = dataset.table(table_name)
    job_name = str(uuid.uuid4())
    job = bigquery_client.load_table_from_storage(
        job_name, table, source)
    job.print_header = True
    job.source_format = 'CSV'
    job.field_delimeter = '\t'
    job.begin()
    wait_for_job(job)
    print('Loaded {} rows into {}:{}'.format(
        job.output_rows, dataset_name, table_name))

Traceback (most recent call last):
  File "bigquery_discrepancy_task.py", line 70, in <module>
    load_data_from_gcs('test_data_set_gamehouse', 'name_surname', dst_uri)
  File "bigquery_discrepancy_task.py", line 53, in load_data_from_gcs
    job.begin()
  File "/home/ilker/venv/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py", line 320, in begin
    method='POST', path=path, data=self._build_resource())
  File "/home/ilker/venv/local/lib/python2.7/site-packages/google/cloud/_http.py", line 294, in api_request
    data = json.dumps(data)
  File "/usr/lib/python2.7/json/__init__.py", line 243, in dumps
    return _default_encoder.encode(obj)
  File "/usr/lib/python2.7/json/encoder.py", line 207, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/usr/lib/python2.7/json/encoder.py", line 270, in iterencode
    return _iterencode(o, 0)
  File "/usr/lib/python2.7/json/encoder.py", line 184, in default
    raise TypeError(repr(o) + " is not JSON serializable")
TypeError: gs://mybucket/sample_file.csv is not JSON serializable
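
For reference, this is the bare-bones pattern from the Google sample code that I adapted my function from. It is only a minimal sketch: I am assuming the GCS URI is passed as a plain 'gs://...' string, the destination table already exists with a schema, and the dataset, table, and bucket names below are placeholders (the attribute is spelled field_delimiter here, and load_tsv_from_gcs is just my throwaway helper name).

import time
import uuid

from google.cloud import bigquery


def wait_for_job(job):
    # Poll until BigQuery marks the job as DONE, then surface any errors.
    while True:
        job.reload()
        if job.state == 'DONE':
            if job.error_result:
                raise RuntimeError(job.errors)
            return
        time.sleep(1)


def load_tsv_from_gcs(dataset_name, table_name, source_uri):
    # source_uri is expected to be a plain string, e.g. 'gs://mybucket/sample_file.csv'
    client = bigquery.Client()
    table = client.dataset(dataset_name).table(table_name)
    job = client.load_table_from_storage(str(uuid.uuid4()), table, source_uri)
    job.source_format = 'CSV'     # tab-delimited files still go through the CSV loader
    job.field_delimiter = '\t'    # note the spelling: field_delimiter
    job.begin()
    wait_for_job(job)
    print('Loaded {} rows into {}:{}'.format(
        job.output_rows, dataset_name, table_name))


# Placeholder names; the destination table is assumed to exist already.
load_tsv_from_gcs('my_dataset', 'my_table', 'gs://mybucket/sample_file.csv')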

0 Answers:

There are no answers yet.