Unable to start spark-shell

Time: 2018-07-26 04:46:57

Tags: apache-spark pyspark

I am unable to resolve this issue. I have installed Hadoop and it works fine, and the Hadoop path is correct.

However, when I try to start Spark, I get the following error:

    /usr/local/spark-2.3.1-bin-hadoop2.7/bin$ spark-shell
    Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder
        at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:205)
        at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:119)
        at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:108)
        at org.apache.spark.deploy.SparkSubmit$.initializeLogIfNecessary(SparkSubmit.scala:71)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:128)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Caused by: java.lang.ClassNotFoundException: org.slf4j.impl.StaticLoggerBinder
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 6 more

I have searched for and tried various alternatives, but the error remains the same.
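Since the missing class comes from the slf4j logging binding, one quick check (a minimal sketch, assuming the standard Spark 2.x layout where the bundled jars live under `$SPARK_HOME/jars`) is whether that binding jar is present at all:

    # List any slf4j jars bundled with this Spark distribution
    ls /usr/local/spark-2.3.1-bin-hadoop2.7/jars/ | grep -i slf4j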

Here is my ~/.bashrc file:

    export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_161
    export PATH=$PATH:JAVA_HOME/bin
    export HADOOP_HOME=/usr/local/hadoop

    export PATH=$PATH:$HADOOP_HOME/bin

    export HIVE_HOME=/usr/local/apache-hive-3.0.0-bin
    export HIVE_CONF=/usr/local/apache-hive-3.0.0-bin/conf
    export PATH=$HIVE_HOME/bin:$PATH
    export CLASSPATH=$CLASSPATH:/usr/local/hadoop/lib/*:.
    export CLASSPATH=$CLASSPATH:/usr/local/apache-hive-3.0.0-bin/lib/*:.
    export SPARK_HOME=/usr/local/spark-2.3.1-bin-hadoop2.7
    export SPARK_CONF=/usr/local/spark-2.3.1-bin-hadoop2.7/conf
    export PATH=$SPARK_HOME/bin:$PATH
    export CLASSPATH=$CLASSPATH:/usr/local/spark-2.3.1-bin-hadoop2.7/lib/*:.
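For context, a quick sanity check after sourcing the file (a sketch using the paths from the config above) looks like this:

    source ~/.bashrc
    echo $SPARK_HOME      # expected: /usr/local/spark-2.3.1-bin-hadoop2.7
    echo $HADOOP_HOME     # expected: /usr/local/hadoop
    # Show which java/spark entries actually ended up on PATH
    echo $PATH | tr ':' '\n' | grep -iE 'java|spark'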

In addition, here is the output of `sudo update-alternatives --config java` and `sudo update-alternatives --config javac`:

    There are 2 choices for the alternative java (providing /usr/bin/java).

      Selection    Path                                            Priority   Status
    ------------------------------------------------------------
      0            /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java   1081      auto mode
      1            /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java   1081      manual mode
    * 2            /usr/lib/jvm/jdk1.8.0_161/bin/java               0         manual mode

Another issue I found is that java's location is still /usr/bin/java, rather than the exported path I set.
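A sketch of how that can be verified (the alternatives symlink chain decides which binary /usr/bin/java really points to):

    which java                        # typically /usr/bin/java
    readlink -f $(which java)         # resolves the alternatives symlinks to the actual JDK binary
    update-alternatives --list java   # lists all registered java alternatives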

Please let me know how to resolve this issue.

1 Answer:

Answer 0: (score: 0)

Open {SPARK_HOME}/conf/spark-env.sh (if the file does not exist yet, it can be created by copying spark-env.sh.template in the same directory).

Add the following entry to spark-env.sh:

    export SPARK_DIST_CLASSPATH=$(hadoop classpath)

Restart Spark.

This should fix the problem. It essentially adds the Hadoop classpath to Spark.
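A quick way to confirm the effect, assuming `hadoop classpath` runs from the shell (`--glob` expands the wildcard entries it prints):

    # The Hadoop classpath picked up by Spark should now contain an slf4j binding
    hadoop classpath --glob | tr ':' '\n' | grep -i slf4j

    # Then start the shell again
    spark-shell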