Spark 1.5.1 spark-shell throws RuntimeException

Date: 2015-10-20 10:32:05

Tags: apache-spark

I am just trying to launch the spark-shell on my local Windows 8 machine, and this is the error message I get:

java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
    at java.lang.reflect.Constructor.newInstance(Unknown Source)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    at $iwC$$iwC.<init>(<console>:9)
    at $iwC.<init>(<console>:18)
    at <init>(<console>:20)
    at .<init>(<console>:24)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)

Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
    ... 56 more

The REPL still comes up somehow, but I cannot use sqlContext.

Has anyone run into this problem before? Any answer would help, thanks.

3 answers:

Answer 0 (score: 2)

Solved: I downloaded the correct winutils version and the problem was resolved. Ideally it should be compiled locally, but if you download a pre-compiled binary, make sure it matches your 32/64-bit architecture. I tried it on 64-bit Windows 7 with Spark 1.6, downloaded winutils.exe from https://www.barik.net/archive/2015/01/19/172716/, and it worked! The full steps are at: http://letstalkspark.blogspot.com/2016/02/getting-started-with-spark-on-window-64.html

Answer 1 (score: 2)

First, you need to download the winutils.exe build that is compatible with your Spark version and operating system. Place it somewhere inside a folder with a `bin` subdirectory, say D:\winutils\bin\winutils.exe.
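As a side note (not stated in the answer itself, but commonly required), Spark locates winutils.exe through the HADOOP_HOME environment variable. A minimal sketch in Windows cmd, assuming the hypothetical D:\winutils path above:

```shell
:: Assumes winutils.exe was placed at D:\winutils\bin\winutils.exe
set HADOOP_HOME=D:\winutils
set PATH=%HADOOP_HOME%\bin;%PATH%
```

Set these before starting spark-shell so the Hadoop libraries can find the binary.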

Now, if /tmp/hive exists on the D: drive, run the following command:

D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive
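If the directory does not exist yet, you can create it first and then check the result; a sketch of the full sequence, assuming the same hypothetical D:\winutils location:

```shell
:: Windows cmd; assumes winutils.exe is at D:\winutils\bin\winutils.exe
mkdir D:\tmp\hive
D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive
:: winutils ls shows the POSIX-style permissions; drwxrwxrwx indicates success
D:\winutils\bin\winutils.exe ls D:\tmp\hive
```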

For more details, see the following posts:

Frequent Issues occurred during Spark Development

https://issues.apache.org/jira/browse/SPARK-10528

Answer 2 (score: 0)

In this case, this may help: https://issues.apache.org/jira/browse/SPARK-10528