spark-shell does not work

Asked: 2018-01-03 17:23:40

Tags: scala apache-spark

After installing Spark, I typed spark-shell in the terminal, but it does not work. It shows:

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
Note that as of 2.8 scala does not assume use of the java classpath.
For the old behavior pass -usejavacp to scala, or if using a Settings
object programmatically, settings.usejavacp.value = true.
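
For reference, the last two lines are the Scala REPL's own hint: either pass -usejavacp to the REPL, or set the scala.usejavacp system property. A minimal sketch of forwarding that property through spark-shell, assuming the stock --driver-java-options / spark.driver.extraJavaOptions options of spark-submit/spark-shell (not confirmed to fix this particular failure):

rem sketch: set the usejavacp system property on the driver JVM at launch time
spark-shell --driver-java-options "-Dscala.usejavacp=true"
rem equivalently, via Spark configuration
spark-shell --conf "spark.driver.extraJavaOptions=-Dscala.usejavacp=true"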

Following the solution I found on this page: http://blog.csdn.net/xiaomin1991222/article/details/50981584, I modified /spark-2.2.0-bin-hadoop2.7/bin/spark-class2.cmd and added this code:

rem Set JAVA_OPTS to be able to load native libraries and to set heap size   
set JAVA_OPTS=%OUR_JAVA_OPTS% -Djava.library.path=%SPARK_LIBRARY_PATH% -Xms%SPARK_MEM% -Xmx%SPARK_MEM% -Dscala.usejavacp=true

However, it still does not work. My JDK and JRE are installed and working correctly.
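
As a quick sanity check (a sketch, assuming a standard Windows setup where spark-class2.cmd resolves java from JAVA_HOME when it is set), the JVM that the launcher scripts actually pick up can be confirmed from the same terminal:

rem sketch: verify which JVM spark-class2.cmd will use
echo %JAVA_HOME%
"%JAVA_HOME%\bin\java" -version
where java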

0 Answers:

No answers yet.