Error when starting Spark with start-all.sh

Date: 2017-04-05 12:24:01

Tags: scala apache-spark spark-dataframe

When I try to start Spark using the start-all.sh script, it throws the following error:

> localhost: failed to launch: nice -n 0 bin/spark-class
> org.apache.spark.deploy.worker.Worker --webui-port 8081
> spark://dev-pipeline-west-eu.jwn4tgenexauzewylryxtm545b.ax.internal.cloudapp.net:7077
> localhost:       at
> sun.launcher.LauncherHelper.loadMainClass(java.base@9-internal/LauncherHelper.java:585)
> localhost:       at
> sun.launcher.LauncherHelper.checkAndLoadMain(java.base@9-internal/LauncherHelper.java:497)
> localhost: full log in
> /spark-2.1.0-bin-hadoop2.7/logs/spark-shankar-org.apache.spark.deploy.worker.Worker-1-dev-pipeline-west-eu.out

When I look at the log file at /spark-2.1.0-bin-hadoop2.7/logs/spark-shankar-org.apache.spark.deploy.worker.Worker-1-dev-pipeline-west-eu.out, I see the following error:

> Error: A JNI error has occurred, please check your installation and
> try again Exception in thread "main"
> java.lang.ArrayIndexOutOfBoundsException: 64
>     at java.util.jar.JarFile.match(java.base@9-internal/JarFile.java:983)
>     at java.util.jar.JarFile.checkForSpecialAttributes(java.base@9-internal/JarFile.java:1017)
>     at java.util.jar.JarFile.isMultiRelease(java.base@9-internal/JarFile.java:399)
>     at java.util.jar.JarFile.getEntry(java.base@9-internal/JarFile.java:524)
>     at java.util.jar.JarFile.getJarEntry(java.base@9-internal/JarFile.java:480)
>     at jdk.internal.util.jar.JarIndex.getJarIndex(java.base@9-internal/JarIndex.java:114)

What is causing this error?

2 answers:

Answer 0 (score: 3)

I had the same problem on Ubuntu 16.04. Updating Java fixed it:

sudo apt-add-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java8-installer

java -version

java version "1.8.0_131"
Java(TM) SE Runtime Environment (build 1.8.0_131-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.131-b11, mixed mode)
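Before restarting Spark, it can help to confirm which major version the `java` on the PATH actually reports. A minimal sketch (not part of the original answer) that extracts the major version from a `java -version`-style string, handling both the old `1.8.0_131` and the new `9-internal` formats:

```shell
# In practice you would capture the real output:
#   ver=$(java -version 2>&1 | head -n 1)
ver='java version "1.8.0_131"'

# Strip everything except the major version number
# ("1.8.0_131" -> 8, "9-internal" -> 9).
major=$(echo "$ver" | sed -E 's/.*"(1\.)?([0-9]+).*/\2/')
echo "$major"   # prints 8
```

If this prints 9, Spark 2.1.0 will hit the JNI error above; it needs a Java 8 runtime.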

Answer 1 (score: -1)

Solution: use Java version 8 instead of version 9.

Option 1: one option is to uninstall Java 9 and reinstall Java 8. (You can consult an article on installing Java; just make sure you make the changes needed so that version 8 is installed.)

Option 2: if Java 8 is already installed and you are on Ubuntu, you can switch to it with:

sudo update-alternatives --config java

When the prompt appears, type the number corresponding to Java 8 and press Enter.
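Alternatively, Spark daemons can be pointed at a specific JDK regardless of the system default by setting JAVA_HOME in conf/spark-env.sh (this goes beyond the linked article; the JDK path below is hypothetical and must match your installation):

```shell
# conf/spark-env.sh -- sourced by Spark's startup scripts, so this
# JAVA_HOME overrides the system default for Spark daemons only.
# /usr/lib/jvm/java-8-oracle is a hypothetical path; adjust to your install.
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
```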

Source: http://continualintegration.com/miscellaneous-articles/how-do-you-troubleshoot-the-spark-shell-error-a-jni-error-has-occurred