Cannot run Spark

Date: 2014-11-15 21:03:56

Tags: java python apache-spark

A few days ago I extracted Spark on my machine (Ubuntu) and tested it, and everything looked fine. Today I think I changed some Java paths, and now Spark won't start. Instead, I get the following error message:

user@user:~/Software/spark-1.1.0-bin-hadoop2.4$ ./bin/pyspark
Python 2.7.8 (default, Oct 20 2014, 15:05:19) 
[GCC 4.9.1] on linux2
Type "help", "copyright", "credits" or "license" for more information.
/home/user/Software/spark-1.1.0-bin-hadoop2.4/bin/spark-class: line 180: /usr/lib/jvm/java-7-sun/bin/bin/java: No such file or directory
Traceback (most recent call last):
  File "/home/user/Software/spark-1.1.0-bin-hadoop2.4/python/pyspark/shell.py", line 44, in <module>
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "/home/user/Software/spark-1.1.0-bin-hadoop2.4/python/pyspark/context.py", line 104, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "/home/user/Software/spark-1.1.0-bin-hadoop2.4/python/pyspark/context.py", line 211, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "/home/user/Software/spark-1.1.0-bin-hadoop2.4/python/pyspark/java_gateway.py", line 71, in launch_gateway
    raise Exception(error_msg)
Exception: Launching GatewayServer failed with exit code 127!
Warning: Expected GatewayServer to output a port, but found no output.

Running Java programs with Eclipse still works.

Edit:

which java: /usr/bin/java
javac -version: javac 1.7.0_65
echo $JAVA_HOME: /usr/lib/jvm/java-7-sun/bin

1 Answer:

Answer 0 (score: 1)

Your error message contains the path /usr/lib/jvm/java-7-sun/bin/bin/java. Note the duplicated bin segment.

bin must not be part of JAVA_HOME; set it to /usr/lib/jvm/java-7-sun/ instead.
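A minimal sketch of the fix, assuming the JVM lives at the path shown in the question (adjust to your actual installation). It strips a trailing /bin from JAVA_HOME so scripts like spark-class, which append bin/java themselves, build the correct path:

```shell
# Current, broken value (from the question's "echo $JAVA_HOME"):
JAVA_HOME="/usr/lib/jvm/java-7-sun/bin"

# Remove a trailing /bin if present; harmless if it is already absent.
JAVA_HOME="${JAVA_HOME%/bin}"
export JAVA_HOME

echo "$JAVA_HOME"   # /usr/lib/jvm/java-7-sun
```

To make the change permanent, put the export line (with the corrected path) in ~/.bashrc or /etc/environment; spark-class then resolves $JAVA_HOME/bin/java to an existing binary.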