Cannot run Spark spark-shell

Posted: 2018-01-14 20:39:24

Tags: apache-spark

The information in the post Cannot start spark-shell did not help.

The environment is set up as follows:

java -version
java version "9.0.1"
Java(TM) SE Runtime Environment (build 9.0.1+11)
Java HotSpot(TM) 64-Bit Server VM (build 9.0.1+11, mixed mode)

Oracle Java 8 and OpenJDK 8 were also tried, with equally bad results.

Scala code runner version 2.12.4; version 2.10 fails as well.

Spark binaries spark-2.2.1-bin-hadoop2.6 from apache.org; the Hadoop version is 2.6.

JAVA_HOME=/usr/lib/jvm/java-9-oracle
env | grep spark
SPARK_HOME=/usr/local/spark
env | grep scala
SCALA_HOME=/usr/local/scala
env | grep hadoop
HADOOP_HOME=/usr/local/hadoop

PATH=/usr/lib/jvm/java-9-oracle/bin:
/usr/lib/jvm/java-9-oracle/db/bin:
/usr/local/scala/bin:/usr/local/spark/bin:
/usr/local/scala/bin

SPARK_DIST_CLASSPATH="$HADOOP_HOME/etc/hadoop/*:
$HADOOP_HOME/share/hadoop/common/lib/*:
$HADOOP_HOME/share/hadoop/common/*:$HADOOP_HOME/share/hadoop/hdfs/*:
$HADOOP_HOME/share/hadoop/hdfs/lib/*:$HADOOP_HOME/share/hadoop/hdfs/*:
$HADOOP_HOME/share/hadoop/yarn/lib/*:$HADOOP_HOME/share/hadoop/yarn/*:
$HADOOP_HOME/share/hadoop/mapreduce/lib/*:$HADOOP_HOME/share/hadoop/mapreduce/*:
$HADOOP_HOME/share/hadoop/tools/lib/*"    

Run Spark:

spark-shell

Output:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/spark/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Setting default log level to "WARN".

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.

Exception in thread "main" java.lang.NullPointerException
at scala.reflect.internal.SymbolTable.exitingPhase(SymbolTable.scala:256)
at scala.tools.nsc.interpreter.IMain$Request.x$20$lzycompute(IMain.scala:896)
at scala.tools.nsc.interpreter.IMain$Request.x$20(IMain.scala:895)
at scala.tools.nsc.interpreter.IMain$Request.headerPreamble$lzycompute(IMain.scala:895)

What is the reason???

2 answers:

Answer 0 (score: 3)

Spark 2.2.x does not yet run on Java 9. Change your configuration to use Java 8.

Set:

JAVA_HOME=/usr/lib/jvm/java-8-oracle
PATH="$JAVA_HOME/bin:"$PATH

Make sure Java 8 is the default version:

sudo update-alternatives --config java
sudo update-alternatives --config javac
sudo update-alternatives --config javah
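
After switching, it is worth checking which binaries the shell actually resolves; a quick sanity check, assuming the alternatives symlinks live under /etc/alternatives as on Debian/Ubuntu:

which java javac               # both should come from the Java 8 PATH entry
readlink -f "$(which java)"    # should resolve into the java-8 JVM directory, not java-9
java -version                  # expect: java version "1.8.0_..."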

If all else fails, uninstall Java 9:

sudo apt-get purge oracle-java9-installer
sudo add-apt-repository --remove ppa:webupd8team/java

I hope that helps.

Answer 1 (score: 0)

I have both Java 9 and Java 8 installed. Simply setting JAVA_HOME worked for me.

I am running env JAVA_HOME=/usr/lib/jvm/java-8-openjdk/jre/ ./spark-shell (in bash: JAVA_HOME=/usr/lib/jvm/java-8-openjdk/jre/ ./spark-shell). (Spark 2.2.1, Scala 2.11.8)
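
If you would rather not touch the shell environment at all, Spark also reads JAVA_HOME from conf/spark-env.sh; a minimal sketch, assuming SPARK_HOME=/usr/local/spark as in the question (the OpenJDK path here is only an example):

# create spark-env.sh from the shipped template and pin Java 8 for all Spark launch scripts
cp /usr/local/spark/conf/spark-env.sh.template /usr/local/spark/conf/spark-env.sh
echo 'export JAVA_HOME=/usr/lib/jvm/java-8-openjdk/jre/' >> /usr/local/spark/conf/spark-env.sh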