I am using a UDF to extract data from a Hive table. The source Hive table has columns stored as JSON, and in my UDF I parse the JSON and pull out the values (a simplified sketch of the UDF is included at the end of this post). But the query fails with:
java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.ArrayWritable
    at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.write(ParquetRecordWriterWrapper.java:88)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:645)
The full error message:
Error: java.lang.RuntimeException: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.ArrayWritable
    at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:283)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1595)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.ArrayWritable
    at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.write(ParquetRecordWriterWrapper.java:88)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:645)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:793)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:87)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:793)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.internalForward(CommonJoinOperator.java:638)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genAllOneUniqueJoinObject(CommonJoinOperator.java:670)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:754)
    at org.apache.hadoop.hive.ql.exec.JoinOperator.endGroup(JoinOperator.java:256)
    at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:216)
    ... 7 more
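
For context, the UDF is roughly of this shape. This is a simplified sketch, not my exact code; the class name and the use of the org.json library are stand-ins:

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;
import org.json.JSONObject;

public final class ExtractJsonField extends UDF {
    // Returns the value of one key from a JSON string column,
    // or NULL if the key is absent or the JSON is malformed.
    public Text evaluate(Text json, Text key) {
        if (json == null || key == null) {
            return null;
        }
        try {
            JSONObject obj = new JSONObject(json.toString());
            Object value = obj.opt(key.toString());
            return value == null ? null : new Text(value.toString());
        } catch (Exception e) {
            // Bad JSON rows become NULL instead of failing the task.
            return null;
        }
    }
}

Since evaluate returns Text, the UDF output is a plain STRING value in the query; the failing step is the write into the Parquet target table.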