When importing from mongo into BigQuery, the following error occurs. We have a script that prepares data from a mongo dump on S3 (about 2.8 GB) and converts it to NEWLINE_DELIMITED_JSON.
This script worked fine until recently and has not been changed.
Does anyone know how to fix this, and how to find the document that is causing the problem?
"status": {
"errorResult": {
"message": "Error while reading data, error message: JSON table encountered too many errors, giving up. Rows: 41081; errors: 1. Please look into the errors[] collection for mor
e details.",
"reason": "invalid"
},
"errors": [
{
"message": "Error while reading data, error message: JSON table encountered too many errors, giving up. Rows: 41081; errors: 1. Please look into the errors[] collection for m
ore details.",
"reason": "invalid"
},
{
"message": "Error while reading data, error message: JSON processing encountered too many errors, giving up. Rows: 41081; errors: 1; max bad: 0; error percent: 0",
"reason": "invalid"
},
{
"message": "Error while reading data, error message: JSON parsing error in row starting at position 2890606042: Parser terminated before end of string",
"reason": "invalid"
}
],
"state": "DONE"
Answer 0 (score: 0):
Look closely at the data. I ran into the same problem, and it turned out that one field had a NaN value, which was fine for our application (in Python/TS) but not for BigQuery.
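
Since Python's json module both emits and accepts NaN/Infinity by default, such rows round-trip cleanly in the preparation script but fail BigQuery's stricter parser. A minimal sketch for scanning the export for these rows, using parse_constant to reject the non-standard constants ("dump.json" is again a hypothetical filename):

    import json

    def reject_constant(name):
        # json.loads() accepts NaN, Infinity and -Infinity by default;
        # BigQuery's JSON loader does not, so treat them as errors here.
        raise ValueError("non-standard JSON constant: " + name)

    def find_bad_rows(path):
        with open(path, "r", encoding="utf-8") as f:
            for lineno, line in enumerate(f, start=1):
                try:
                    json.loads(line, parse_constant=reject_constant)
                except ValueError as exc:
                    # JSONDecodeError is a ValueError subclass, so this also
                    # catches rows that are outright malformed (e.g. truncated).
                    print("line %d: %s" % (lineno, exc))

    find_bad_rows("dump.json")  # hypothetical filename

Running this over the prepared NDJSON file should print the line numbers of any rows BigQuery would reject, including both NaN-carrying documents and genuinely malformed ones.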