Error when trying to save data from a DataFrame to a Hive table

Asked: 2017-06-20 15:51:57

Tags: scala apache-spark hive

We are hitting the following problem when we try to insert data into a Hive table.

  

Job aborted due to stage failure: Task 5 in stage 65.0 failed 4 times, most recent failure: Lost task 5.3 in stage 65.0 (TID 987, tnblf585.test.sprint.com): java.lang.ArrayIndexOutOfBoundsException: 45
    at org.apache.spark.sql.catalyst.expressions.GenericMutableRow.genericGet(rows.scala:254)
    at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.getAs(rows.scala:35)
    at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.isNullAt(rows.scala:36)
    at org.apache.spark.sql.catalyst.expressions.GenericMutableRow.isNullAt(rows.scala:248)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$org$apache$spark$sql$hive$execution$InsertIntoHiveTable$$writeToFile$1$1.apply(InsertIntoHiveTable.scala:107)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$org$apache$spark$sql$hive$execution$InsertIntoHiveTable$$writeToFile$1$1.apply(InsertIntoHiveTable.scala:104)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.org$apache$spark$sql$hive$execution$InsertIntoHiveTable$$writeToFile$1(InsertIntoHiveTable.scala:104)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:84)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:84)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:

1 Answer:

Answer 0 (score: 0)

I found that one of the column names in the DataFrame did not match the corresponding column in the Hive table. After correcting the column name, the data loaded correctly.
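The check behind this fix can be sketched as a plain Scala helper that compares the DataFrame's column names against the target table's columns before writing. This is a minimal sketch, not Spark API code: the column lists and names (`cust_id`, `customer_id`, `amount`) are hypothetical, and the comparison is case-insensitive because Hive stores column names in lowercase.

```scala
// Sketch: find column names present in one schema but not the other.
// In Spark you would obtain these lists from df.columns and from the
// target table's schema (e.g. spark.table("db.tbl").columns).
object SchemaCheck {
  // Returns (names only in the DataFrame, names only in the table),
  // compared case-insensitively since Hive lowercases column names.
  def mismatches(dfCols: Seq[String],
                 tableCols: Seq[String]): (Set[String], Set[String]) = {
    val df  = dfCols.map(_.toLowerCase).toSet
    val tbl = tableCols.map(_.toLowerCase).toSet
    (df -- tbl, tbl -- df)
  }
}

// Hypothetical example: the DataFrame says "cust_id" but the Hive table
// column is "customer_id" — exactly the kind of mismatch in the answer.
val (onlyInDf, onlyInTable) =
  SchemaCheck.mismatches(Seq("cust_id", "amount"),
                         Seq("customer_id", "amount"))
println(s"only in DataFrame: $onlyInDf")   // Set(cust_id)
println(s"only in table:     $onlyInTable") // Set(customer_id)
```

If a mismatch turns up, renaming with `withColumnRenamed` before the write is the usual remedy. Note also that `insertInto` in Spark resolves columns by position rather than by name, so an out-of-order or misnamed schema can surface as a runtime error like the `ArrayIndexOutOfBoundsException` above instead of a clear analysis error.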