Hive on Spark: java.lang.NoClassDefFoundError: org/apache/hive/spark/client/Job

Asked: 2015-10-20 09:52:43

Tags: apache-spark hive apache-spark-sql

When I run a query on the Hive console in debug mode, I get the error listed below. I am using Hive 1.2.1 and Spark 1.5.1; I checked the hive-exec jar, and it does contain the class definition org/apache/hive/spark/client/Job.
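One way to double-check that is to try loading the class with the same JVM and classpath Hive uses. A minimal sketch (only the class name comes from the stack trace below; the class and variable names here are illustrative), compiled and run with the hive-exec jar on the classpath:

public class CheckJobClass {
    public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException if the class is not on the classpath
        Class<?> c = Class.forName("org.apache.hive.spark.client.Job");
        // Print which jar the class was actually loaded from
        System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
    }
}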

Caused by: java.lang.NoClassDefFoundError: org/apache/hive/spark/client/Job
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:792)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:136)
    at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:99)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
    at org.apache.hive.spark.client.rpc.KryoMessageCodec.decode(KryoMessageCodec.java:96)
    at io.netty.handler.codec.ByteToMessageCodec$1.decode(ByteToMessageCodec.java:42)
    at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:327)
    ... 15 more

In the end, the query fails with:

ERROR spark.SparkTask: Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'

How can I resolve this issue?

1 answer:

Answer 0 (score: 1):

In the hive-1.2.1 pom.xml, spark.version is 1.3.1; that is, Hive 1.2.1 was built against Spark 1.3.1, not the Spark 1.5.1 you are running.
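For reference, the version property sits in the root pom.xml of the Hive 1.2.1 source tree; an illustrative excerpt from memory (verify against your own copy):

<properties>
  ...
  <spark.version>1.3.1</spark.version>
  ...
</properties>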

So the simple fix is to download a prebuilt spark-1.3.1-bin-hadoop package from spark.apache.org.

Then add its path to hive-site.xml, for example:

<property>
  <name>spark.home</name>
  <value>/path/spark-1.3.1-bin-hadoop2.4</value>
</property>
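With spark.home pointing at the matching Spark build, also make sure the session actually runs on the Spark engine. A minimal sketch using the standard hive.execution.engine property (run in the Hive CLI):

-- Switch this session's execution engine from MapReduce to Spark
set hive.execution.engine=spark;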