Problem loading data into a partitioned table
1) I want to load the data from an existing table enodeb into a partitioned table enodebpartition. The schema of enodeb is:
-state string
-circle string
-businessranking string
-enodebstatus string
-shape binary
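For context, here is a minimal sketch of how the source table enodeb could have been declared, assuming it also uses the Esri JSON SerDe (its actual DDL is not shown above, so the storage clauses here are an assumption):
-- Assumed DDL for the existing source table; the real enodeb table
-- may have been created with a different SerDe or storage format.
create table enodeb (
  state string,
  circle string,
  businessranking string,
  enodebstatus string,
  shape binary
)
ROW FORMAT SERDE 'com.esri.hadoop.hive.serde.JsonSerde'
STORED AS INPUTFORMAT 'com.esri.json.hadoop.UnenclosedJsonInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat';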
2) I created the partitioned table enodebpartition as shown below:
create table enodebpartition (circle string, businessranking string,
enodebstatus string, shape binary)
partitioned by (state string)
ROW FORMAT SERDE 'com.esri.hadoop.hive.serde.JsonSerde'
STORED AS INPUTFORMAT 'com.esri.json.hadoop.UnenclosedJsonInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat';
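Before loading, the table definition can be checked to confirm that state is registered as a partition column and that the Esri SerDe is attached (a routine verification step, not something from the run described here):
-- DESCRIBE FORMATTED lists the partition columns and the SerDe library.
DESCRIBE FORMATTED enodebpartition;
-- Expected: state under "# Partition Information" and
-- com.esri.hadoop.hive.serde.JsonSerde as the SerDe Library.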
3) I then loaded the data from the already existing enodeb table into it:
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;
insert overwrite table enodebpartition partition (state)
select * from enodeb;
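For reference, here is the same load written with an explicit column list: with dynamic partitioning, Hive maps the last column of the SELECT to the partition column state, whereas select * emits state first given the source schema in step 1. Whether this ordering is related to the ClassCastException below is not confirmed:
-- Explicit column list: the dynamic partition column (state) goes last.
-- Column names follow the enodeb schema listed in step 1.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;
insert overwrite table enodebpartition partition (state)
select circle, businessranking, enodebstatus, shape, state
from enodeb;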
4) But I get the following error:
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {
  "state": "Punjab",
  "circle": "PUNJAB",
  "businessranking": "2",
  "enodebstatus": "SCFT Samsung Acceptance Initiated",
  "shape": ף0Kc#������ IA
}
  at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:172)
  at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:422)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {
  "state": "Punjab",
  "circle": "PUNJAB",
  "businessranking": "2",
  "enodebstatus": "SCFT Samsung Acceptance Initiated",
  "shape": ף0Kc#������ IA
}
  at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:545)
  at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
  ... 8 more
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.objectinspector.SubStructObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector
  at com.esri.hadoop.hive.serde.JsonSerde.serialize(Unknown Source)
  at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:712)
  at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:838)
  at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
  at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:838)
  at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:97)
  at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:164)
  at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:535)
  ... 9 more
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1: Map: 5   HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
5) In short, the error I get is:
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.objectinspector.SubStructObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector
I need help because I cannot make sense of this error. Please help me resolve it.
Thanks