Apache Spark exception when running spark-shell

Asked: 2016-02-15 19:39:58

Tags: apache-spark spark-streaming

I have installed Apache Spark on a single node. When I run spark-shell, I get the exception below. Despite the exception, I can still create RDDs and run Scala code snippets.

Here is the exception:

16/02/15 14:21:29 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/02/15 14:21:31 WARN : Your hostname, Rahul-PC resolves to a loopback/non-reachable address: fe80:0:0:0:c0c1:cd2e:990d:17ac%e
java.lang.RuntimeException: java.lang.NullPointerException
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
        at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
        at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
        at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
        at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
        at $iwC$$iwC.<init>(<console>:9)
        at $iwC.<init>(<console>:18)
        at <init>(<console>:20)
        at .<init>(<console>:24)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

My JAVA_HOME is set to point to the correct JDK installation folder:

JAVA_HOME = C:\Program Files\Java\jdk1.8.0

Is there anything else I need to do? Please advise.

1 Answer:

Answer 0 (score: 0)

I found the solution. Spark needs winutils.exe in order to initialize the Hive context. The C:\Windows\tmp folder created when running spark-shell also needs to have sufficient permissions; see the sketch below.
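As a minimal sketch of how this wiring looks in a standalone Spark 1.6 application (for spark-shell itself, set the HADOOP_HOME environment variable to the same folder before launching the shell, and grant permissions on the tmp directory with something like `winutils.exe chmod 777 C:\Windows\tmp`). The C:\hadoop location is an assumption; winutils.exe must sit in its bin subfolder:

```scala
// Hedged sketch, Spark 1.6.x. Assumes winutils.exe was downloaded to
// C:\hadoop\bin\winutils.exe (the C:\hadoop path is hypothetical).
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveContextOnWindows {
  def main(args: Array[String]): Unit = {
    // Must be set before any Hive/SQL context is created; otherwise Hadoop
    // cannot locate its shell utilities and SessionState.start() throws the
    // NullPointerException shown in the question.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")

    val sc = new SparkContext(
      new SparkConf().setAppName("hive-context-test").setMaster("local[*]"))
    val sqlContext = new HiveContext(sc) // should now initialize cleanly
    sqlContext.sql("SHOW TABLES").show()
    sc.stop()
  }
}
```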

http://blogs.msdn.com/b/arsen/archive/2016/02/09/resolving-spark-1-6-0-quot-java-lang-nullpointerexception-not-found-value-sqlcontext-quot-error-when-running-spark-shell-on-windows-10-64-bit.aspx
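Once winutils.exe is in place and the tmp folder is writable, spark-shell should start without the NullPointerException and expose a working Hive-backed sqlContext. A quick check from the REPL (assuming Spark 1.6, where the shell creates sqlContext at startup):

```scala
// Run inside spark-shell after applying the fix: the auto-created sqlContext
// should be a HiveContext rather than failing to initialize.
sqlContext.isInstanceOf[org.apache.spark.sql.hive.HiveContext] // expected: true
sqlContext.sql("SHOW TABLES").show()                           // no NPE expected
```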