Hive on Spark error - java.lang.IllegalStateException: unread block data

Date: 2015-11-03 16:05:54

Tags: apache-spark hive

After configuring Hive to use Spark as its execution engine, I have been trying to run Hive queries from the Hive CLI.
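
For reference, this is roughly how I switched the engine over in the Hive CLI (these are the standard Hive-on-Spark properties; the table name below is just a placeholder):

-- In the Hive CLI: switch the execution engine to Spark
set hive.execution.engine=spark;
-- Point Hive at the Spark master ('local' works; the cluster URL does not)
set spark.master=spark://spark-master:7077;
-- Any query that launches a Spark job reproduces the problem
select count(*) from my_table;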

It runs fine when spark.master is set to local, but when I set it to my Spark master, spark://spark-master:7077, I get the following error in the Spark logs:

15/11/03 16:37:10 INFO util.Utils: Copying /tmp/spark-5e39df85-d3d7-446f-86e9-d2699501f97e/executor-70d24a32-6913-479d-85b8-32e535dd3dbf/-11208827301446565026180_cache to /usr/local/spark/work/app-20151103163705-0000/0/./hive-exec-1.2.1.jar
15/11/03 16:37:11 INFO executor.Executor: Adding file:/usr/local/spark/work/app-20151103163705-0000/0/./hive-exec-1.2.1.jar to class loader
15/11/03 16:37:11 ERROR executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.IllegalStateException: unread block data
    at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2428)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1382)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:69)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:95)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

I am using Spark 1.4.1 and Hive 1.2.1.

1 Answer:

Answer 0 (score: 1):

For anyone else who may run into the same problem: I managed to figure this out and get past it. I believe the cause was HBase jars missing on the executor side (it only happened when running queries involving HBase through Hive, and only in Spark cluster mode).
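
A quick way to sanity-check this is to confirm, on every Spark worker, that the jars actually exist at the expected paths, for example:

# Run on each Spark worker node; missing files here would explain
# a deserialization failure like the one above
ls /usr/local/hbase-1.1.2/lib/hbase-{protocol,common,server,client}-1.1.2.jar
ls /usr/local/hbase-1.1.2/lib/htrace-core-3.1.0-incubating.jar
ls /usr/local/hive-1.2.1/lib/hive-{hbase-handler,common,exec}-1.2.1.jar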

My solution was to add the following to spark-env.sh:

export SPARK_CLASSPATH=$CLASSPATH

or, if CLASSPATH does not already contain the HBase and Hive jars, spell them out explicitly:

export SPARK_CLASSPATH=/usr/local/hbase-1.1.2/lib/hbase-protocol-1.1.2.jar:/usr/local/hbase-1.1.2/lib/hbase-common-1.1.2.jar:/usr/local/hbase-1.1.2/lib/htrace-core-3.1.0-incubating.jar:/usr/local/hbase-1.1.2/lib/hbase-server-1.1.2.jar:/usr/local/hbase-1.1.2/lib/hbase-client-1.1.2.jar:/usr/local/hive-1.2.1/lib/hive-hbase-handler-1.2.1.jar:/usr/local/hive-1.2.1/lib/hive-common-1.2.1.jar:/usr/local/hive-1.2.1/lib/hive-exec-1.2.1.jar

Alternatively, the same classpath can be added to hive-site.xml:

  <property>
    <name>spark.executor.extraClassPath</name>
    <value>/usr/local/hbase-1.1.2/lib/hbase-protocol-1.1.2.jar:/usr/local/hbase-1.1.2/lib/hbase-common-1.1.2.jar:/usr/local/hbase-1.1.2/lib/htrace-core-3.1.0-incubating.jar:/usr/local/hbase-1.1.2/lib/hbase-server-1.1.2.jar:/usr/local/hbase-1.1.2/lib/hbase-client-1.1.2.jar:/usr/local/hive-1.2.1/lib/hive-hbase-handler-1.2.1.jar:/usr/local/hive-1.2.1/lib/hive-common-1.2.1.jar:/usr/local/hive-1.2.1/lib/hive-exec-1.2.1.jar</value>
  </property>
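
Since Hive passes spark.* properties through to the Spark application it launches, the same classpath should also be settable per-session from the Hive CLI, before the first query starts the Spark session:

-- Same colon-separated jar list as above, set for this session only
set spark.executor.extraClassPath=/usr/local/hbase-1.1.2/lib/hbase-protocol-1.1.2.jar:/usr/local/hbase-1.1.2/lib/hbase-common-1.1.2.jar:/usr/local/hbase-1.1.2/lib/htrace-core-3.1.0-incubating.jar:/usr/local/hbase-1.1.2/lib/hbase-server-1.1.2.jar:/usr/local/hbase-1.1.2/lib/hbase-client-1.1.2.jar:/usr/local/hive-1.2.1/lib/hive-hbase-handler-1.2.1.jar:/usr/local/hive-1.2.1/lib/hive-common-1.2.1.jar:/usr/local/hive-1.2.1/lib/hive-exec-1.2.1.jar;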