Empty string ("") converted to Double data type when importing data from a JSON file with the command-line bq tool

Date: 2016-07-08 07:07:14

Tags: google-bigquery gcloud

What steps will reproduce the problem?

1. I am running the command:

        ./bq load --source_format=NEWLINE_DELIMITED_JSON --schema=lifeSchema.json dataset_test1.table_test_3 lifeData.json

2. I have attached the data source file and the schema file.
3. It throws an error:

        JSON parsing error in row starting at position 0 at file: file-00000000. Could not convert value to double. Field: computed_results_A; Value:

What is the expected output? What do you see instead?

I expect the empty string to be converted to NULL or 0.

What version of the product are you using? On what operating system?

I am using Mac OS X Yosemite.

Source JSON, lifeData.json:

        {"schema": {"vendor": "com.bd.snowplow", "name": "in_life", "format": "jsonschema", "version": "1-0-2"}, "data": {"step": 0, "info_userId": "53493764", "info_campaignCity": "", "info_self_currentAge": 45, "info_self_gender": "male", "info_self_retirementAge": 60, "info_self_married": false, "info_self_lifeExpectancy": 0, "info_dependantChildren": 0, "info_dependantAdults": 0, "info_spouse_working": true, "info_spouse_currentAge": 33, "info_spouse_retirementAge": 60, "info_spouse_monthlyIncome": 0, "info_spouse_incomeInflation": 5, "info_spouse_lifeExpectancy": 0, "info_finances_sumInsured": 0, "info_finances_expectedReturns": 6, "info_finances_loanAmount": 0, "info_finances_liquidateSavings": true, "info_finances_savingsAmount": 0, "info_finances_monthlyExpense": 0, "info_finances_expenseInflation": 6, "info_finances_expenseReduction": 10, "info_finances_monthlyIncome": 0, "info_finances_incomeInflation": 5, "computed_results_A": "", "computed_results_B": null, "computed_results_C": null, "computed_results_D": null, "uid_epoch": "53493764_1466504541604", "status": "initialized", "campaign_id": "", "campaign_link": "", "tool_version": "20150701-LFI-V1"}, "hierarchy": {"rootId": "94583157-af34-4ecb-8024-b9af7c9e54fa", "rootTstamp": "2016-06-21 10:22:24.000", "refRoot": "events", "refTree": ["events", "in_life"], "refParent": "events"}}

Schema JSON, lifeSchema.json:

        {
            "name": "computed_results_A",
            "type": "float",
            "mode": "nullable"
        }
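One common workaround (not part of the original question, just a sketch) is to preprocess the newline-delimited JSON before loading, rewriting the empty strings in the float fields as JSON null so `bq load` can ingest them. A minimal Python sketch; the field list and file paths are assumptions based on the sample above:

```python
import json

# Float fields whose empty-string values should become null (assumed names,
# taken from the sample record above).
FLOAT_FIELDS = {"computed_results_A", "computed_results_B",
                "computed_results_C", "computed_results_D"}

def clean_record(record):
    """Replace "" with None in the known float fields of record["data"]."""
    data = record.get("data", {})
    for field in FLOAT_FIELDS:
        if data.get(field) == "":
            data[field] = None
    return record

def clean_file(src_path, dst_path):
    """Rewrite a newline-delimited JSON file with cleaned records."""
    with open(src_path) as src, open(dst_path, "w") as dst:
        for line in src:
            if line.strip():
                dst.write(json.dumps(clean_record(json.loads(line))) + "\n")
```

After cleaning (e.g. `clean_file("lifeData.json", "lifeData_clean.json")`), the same `bq load` command should accept the file, since `null` is valid for a NULLABLE FLOAT column.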

1 answer:

Answer 0 (score: 2)

Try loading your JSON file as a one-column CSV file.

bq load --field_delimiter='|' proj:set.table file.json json:string

Once the file is loaded into BigQuery, you have full freedom to parse the JSON using JSON_EXTRACT_SCALAR or a JavaScript UDF.
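With the raw JSON sitting in a single STRING column, each field can then be pulled out at query time, and the troublesome empty string handled explicitly. As a rough illustration of that extraction logic (mimicked locally in Python rather than in SQL; the key path and the NULLIF-style empty-string handling are assumptions):

```python
import json

def extract_scalar(json_str, *path):
    """Walk a key path into a JSON string and return a scalar, or None.

    Roughly mirrors JSON_EXTRACT_SCALAR, with empty strings additionally
    mapped to None (like wrapping the result in NULLIF(x, '')).
    """
    value = json.loads(json_str)
    for key in path:
        if not isinstance(value, dict) or key not in value:
            return None
        value = value[key]
    return None if value == "" else value

row = '{"data": {"computed_results_A": "", "computed_results_B": 1.5}}'
print(extract_scalar(row, "data", "computed_results_A"))  # None
print(extract_scalar(row, "data", "computed_results_B"))  # 1.5
```

The design point is the same as in the answer: defer the type conversion until after load, where the query layer can decide how an empty string should be treated, instead of letting the loader fail on it.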