Spark error - installation error

Date: 2018-08-04 20:09:48

Tags: apache-spark pyspark apache-spark-sql

After installing Spark and running

C:\spark-2.3.1-bin-hadoop2.7\bin>spark-shell

I get the following error. Does anyone have any suggestions?

C:\spark-2.3.1-bin-hadoop2.7\bin>spark-shell
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/C:/spark-2.3.1-bin-hadoop2.7/jars/hadoop-auth-2.7.3.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
2018-08-05 01:29:36 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
Exception in thread "main" java.lang.NullPointerException

1 Answer:

Answer 0 (score: 0)

I don't think you have the correct Java or Scala version.

Note that Spark 2.3.1 runs on Java 8+, Python 2.7+/3.4+, and R 3.1+.

For the Scala API, Spark 2.3.1 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).
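
Spark ships its own Scala runtime inside the download, so a quick way to confirm which Scala version a particular build bundles is to list the scala-library jar in its jars folder. This is just a sketch, assuming the default layout of the spark-2.3.1-bin-hadoop2.7 package referenced in the question:

    rem list the bundled Scala runtime jar; the file name carries the version
    dir C:\spark-2.3.1-bin-hadoop2.7\jars\scala-library*

The file name (for example scala-library-2.11.8.jar) shows the Scala version spark-shell itself runs with; only applications you compile against Spark need a matching 2.11.x Scala installation.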

Please check the following two things:

1. Check the Java version installed on the machine from which you submit the Spark application; it must meet the requirements above (Java 8+, Python 2.7+/3.4+ and R 3.1+).
2. Check the Scala version.

    sudo update-alternatives --config java
    sudo update-alternatives --config javac
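
The update-alternatives commands above are Linux-specific. On Windows, where the question's spark-shell is being launched, a rough equivalent check (a sketch, assuming java, javac and scala are on PATH) is:

    rem JVM version the Spark launch scripts will pick up; Spark 2.3.1 targets Java 8 (1.8.x)
    java -version
    javac -version

    rem locally installed Scala, if any; should be a 2.11.x release for Spark 2.3.1 applications
    scala -version

    rem the Spark launch scripts prefer this JDK when the variable is set
    echo %JAVA_HOME%

The illegal reflective access warnings in the log above are typical of JDK 9 or newer, so if java -version reports anything other than 1.8.x, pointing JAVA_HOME at a JDK 8 installation and reopening the command prompt is a likely fix.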