When importing a large CSV (or other type of) file into BigQuery, how can we get the progress of the import? For example, if we have a 1 TB file and use the import csv command, I don't want to simply wait ten hours for the file to be imported. How can we get progress information, or is that impossible?
Right now, we cannot get it before the CSV file has finished loading.
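For context, a minimal sketch of how such a load can be started with the google-cloud-bigquery Java client (the dataset, table, and GCS URI below are placeholders, and a brand-new table would also need a schema); the create call returns a Job handle immediately, while the load itself runs server-side:

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.CsvOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.LoadJobConfiguration;
import com.google.cloud.bigquery.TableId;

public class StartCsvLoad {
  public static void main(String[] args) {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    // Placeholder destination table and source URI, for illustration only.
    TableId destination = TableId.of("my_dataset", "ORIGIN");
    String sourceUri = "gs://my-bucket/huge-file.csv";

    CsvOptions csvOptions = CsvOptions.newBuilder()
        .setSkipLeadingRows(1)            // skip the header row
        .setAllowQuotedNewLines(true)
        .build();

    LoadJobConfiguration config =
        LoadJobConfiguration.newBuilder(destination, sourceUri, csvOptions)
            .setCreateDisposition(JobInfo.CreateDisposition.CREATE_IF_NEEDED)
            .setWriteDisposition(JobInfo.WriteDisposition.WRITE_EMPTY)
            .setMaxBadRecords(1000)
            // .setSchema(...) would be required if the table does not exist yet
            .build();

    // Returns as soon as the job is created; the actual loading happens server-side.
    Job job = bigquery.create(JobInfo.of(config));
    System.out.println("Started load job: " + job.getJobId().getJob());
  }
}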
Regarding the progress bar:
Load-specific statistics are never returned while the job is still in progress. The statistics contain only the start/end times, which the Java API parses into the CopyStatistics class.
{
  "kind": "bigquery#job",
  "etag": "\"smpMas70-D1-zV2oEH0ud6qY21c/crKHebm6x2NXA6pCjE8znB7dp-E\"",
  "id": "YYY:job_l9TWVQ64YjKx7BgDufu2gReMEL0",
  "selfLink": "https://www.googleapis.com/bigquery/v2/projects/YYY/jobs/job_l9TWVQ64YjKx7BgDufu2gReMEL0",
  "jobReference": {
    "projectId": "YYY",
    "jobId": "job_l9TWVQ64YjKx7BgDufu2gReMEL0"
  },
  "configuration": {
    "load": {
      "sourceUris": [
        "gs://datadocs/afdfb50f-cbc2-47d4-985e-080cadefc963"
      ],
      "schema": {
        "fields": [
          ...
        ]
      },
      "destinationTable": {
        "projectId": "YYY",
        "datasetId": "1aaf1682dbc2403e92a0a0ed8534581f",
        "tableId": "ORIGIN"
      },
      "createDisposition": "CREATE_IF_NEEDED",
      "writeDisposition": "WRITE_EMPTY",
      "fieldDelimiter": ",",
      "skipLeadingRows": 1,
      "quote": "\"",
      "maxBadRecords": 1000,
      "allowQuotedNewlines": true,
      "sourceFormat": "CSV"
    }
  },
  "status": {
    "state": "RUNNING"
  },
  "statistics": {
    "creationTime": "1490868448431",
    "startTime": "1490868449147"
  },
  "user_email": "YYY@appspot.gserviceaccount.com"
}
Load statistics are only returned once the entire CSV file has been imported.
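The most we seem to be able to do is poll the job and report its state plus the creation/start timestamps. Below is a minimal sketch with the Java client, reusing the job ID from the response above; note that it exposes no byte- or row-level progress while the job is RUNNING:

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.JobStatistics;
import com.google.cloud.bigquery.JobStatus;

public class PollLoadJob {
  public static void main(String[] args) throws InterruptedException {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
    JobId jobId = JobId.of("job_l9TWVQ64YjKx7BgDufu2gReMEL0"); // job ID from the response above

    while (true) {
      // Re-fetch the job resource; while RUNNING, statistics only carry timestamps.
      Job job = bigquery.getJob(jobId);
      JobStatistics stats = job.getStatistics();
      System.out.printf("state=%s creationTime=%s startTime=%s endTime=%s%n",
          job.getStatus().getState(),   // PENDING / RUNNING / DONE
          stats.getCreationTime(),      // epoch millis
          stats.getStartTime(),
          stats.getEndTime());          // null until the job finishes

      if (job.getStatus().getState() == JobStatus.State.DONE) {
        break;                          // only now are load-specific statistics populated
      }
      Thread.sleep(10_000L);            // poll every 10 seconds
    }
  }
}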
How can we get progress while the upload is in progress?
Answer 0 (score: 1)