I am getting the error below when running the following Spark code:

Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve '`time_period`' given input columns: [_corrupt_record];;
'Project ['time_period]
+- Relation[_corrupt_record#0] json
val dataDF = spark.read.format("json").load("\path of json\data.json")
dataDF.select($"time_period").groupBy($"time_period").count().show()
JSON file:
{"name":"Yin",
"time_Period": "10" ,
address":{"city":"Columbus","state":"Ohio"}}
Any solutions/ideas?
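A likely cause: by default spark.read.json expects one complete JSON object per line, so a pretty-printed record spanning several lines lands in the _corrupt_record column, and time_period cannot be resolved. A minimal sketch of a possible fix, assuming Spark 2.2+ (which supports the multiLine option), a corrected file path, and that the missing quote before "address" in the file is repaired:

```scala
import org.apache.spark.sql.SparkSession

object ReadMultiLineJson {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("json-demo")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // multiLine lets Spark parse a JSON record that spans multiple lines
    val dataDF = spark.read
      .option("multiLine", "true")
      .format("json")
      .load("/path/of/json/data.json") // hypothetical path, adjust to yours

    // The key in the file is "time_Period"; column resolution is
    // case-insensitive by default, but the error only disappears once
    // the record parses instead of falling into _corrupt_record.
    dataDF.groupBy($"time_Period").count().show()

    spark.stop()
  }
}
```

Note that groupBy on its own is enough here; the preceding select on the same column was redundant.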