Pandas BigQuery upload stuck in the streaming buffer

Asked: 2016-05-20 13:48:46

Tags: python pandas google-bigquery

I want to upload a Pandas DataFrame to Google BigQuery using the to_gbq function:

import pandas as pd

# llData (list of row lists) and lsHeadings (column names) are built earlier
dfData = pd.DataFrame(llData, columns=lsHeadings)

sProjectID = dBQConfig['sProjectID']
sTargetDataset = dBQConfig['sTargetDataset']
sTargetTable = dBQConfig['sTargetTable']
sTablePath = "{}.{}".format(sTargetDataset, sTargetTable)  # "dataset.table"

dfData.to_gbq(sTablePath, sProjectID, if_exists='replace')
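For context, to_gbq expects the destination in "dataset.table" form, so it can help to build and sanity-check that string before uploading. A minimal sketch, assuming standard BigQuery naming rules (letters, digits, underscores); the make_table_path helper is hypothetical, not part of pandas:

```python
import re

def make_table_path(dataset, table):
    """Build the 'dataset.table' string that to_gbq expects,
    rejecting names with characters BigQuery does not allow."""
    path = "{}.{}".format(dataset, table)
    if not re.fullmatch(r"\w+\.\w+", path):
        raise ValueError("invalid table path: {!r}".format(path))
    return path

# Example: make_table_path("myDataset", "myTable") -> "myDataset.myTable"
```

Validating the path up front turns a malformed-destination error into an immediate exception instead of a failure partway through the upload.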

The Python script completes successfully, like this (it runs inside a PySpark job):

Streaming Insert is 100% Complete

16/05/19 16:55:38 INFO SparkContext: Invoking stop() from shutdown hook
Process finished with exit code 0

The table then appears in BigQuery; however, it looks like this:

[screenshot of the table as it appears in BigQuery]

It has been like this for almost 24 hours now. Any suggestions?

0 Answers:

No answers yet.