In the past, I've successfully loaded CSV data from a US-hosted GCS bucket into a US-hosted BigQuery dataset. We decided to move our BigQuery data to the EU, so I created a new dataset with that region selected. I was able to populate the tables that were small enough to upload from my machine at home, but two of the tables are too large for that, so I want to load them from files in GCS. I've tried loading from both a US-hosted GCS bucket and an EU-hosted GCS bucket (thinking that bq load might not want to cross regions), but the load fails every time. Below are the error details I get from the bq command line (500, internal error). Does anyone know why this is happening?
{
"configuration": {
"load": {
"destinationTable": {
"datasetId": "######",
"projectId": "######",
"tableId": "test"
},
"schema": {
"fields": [
{
"name": "test_col",
"type": "INTEGER"
}
]
},
"sourceFormat": "CSV",
"sourceUris": [
"gs://######/test.csv"
]
}
},
"etag": "######",
"id": "######",
"jobReference": {
"jobId": "######",
"projectId": "######"
},
"kind": "bigquery#job",
"selfLink": "https://www.googleapis.com/bigquery/v2/projects/######",
"statistics": {
"creationTime": "1445336673213",
"endTime": "1445336674738",
"startTime": "1445336674738"
},
"status": {
"errorResult": {
"message": "An internal error occurred and the request could not be completed.",
"reason": "internalError"
},
"errors": [
{
"message": "An internal error occurred and the request could not be completed.",
"reason": "internalError"
}
],
"state": "DONE"
},
"user_email": "######"
}
Answer 0 (score: 1)

After searching StackOverflow for other related questions, I finally realized that I had set my GCS bucket's region to EUROPE-WEST-1 rather than the multi-region EU location. Things are now working as expected.
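For anyone hitting the same issue, the fix above can be sketched with a few CLI commands. This is a hedged outline, not the exact commands from the question: the bucket and dataset names (`my-bucket`, `my-eu-bucket`, `my_eu_dataset`) are placeholders, and it assumes you have `gsutil` and `bq` configured for the project.

```shell
# Check the bucket's location. For loads into an EU BigQuery dataset,
# this should report "Location constraint: EU" (the multi-region),
# not a single region such as EUROPE-WEST1.
gsutil ls -L -b gs://my-bucket/

# A bucket's location cannot be changed after creation, so create a
# new multi-region EU bucket and copy the files across.
gsutil mb -l EU gs://my-eu-bucket/
gsutil cp gs://my-bucket/test.csv gs://my-eu-bucket/

# Load into the EU dataset from the EU multi-region bucket, using the
# same one-column schema as the job config in the question.
bq load --source_format=CSV my_eu_dataset.test \
    gs://my-eu-bucket/test.csv test_col:INTEGER
```

The key point is that the GCS bucket location must be compatible with the dataset location: a dataset in the `EU` multi-region cannot be loaded from a bucket pinned to a single region like `europe-west1`, even though both are geographically in Europe.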