I downloaded the pre-built Spark 1.5.1 and ran the ./sbin/start-master.sh script. It gave me the following error:
rsync from spark://mycomputer:7077
ssh: Could not resolve hostname spark: Name or service not known
rsync: connection unexpectedly closed (0 bytes received so far) [Receiver]
rsync error: unexplained error (code 255) at io.c(605) [Receiver=3.0.9]
starting org.apache.spark.deploy.master.Master, logging to /root/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-mycomputer.out
failed to launch org.apache.spark.deploy.master.Master:
at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
Could not find the main class: org.apache.spark.launcher.Main. Program will exit.
full log in /root/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-mycomputer.out
I have since resolved the rsync problem by unsetting the SPARK_HOME environment variable.
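For reference, this is roughly what I did (a minimal sketch; the exact file the variable was exported from is specific to my setup):

unset SPARK_HOME                 # clear the stale value in the current shell
# also removed the export line from my shell profile (e.g. ~/.bashrc)
# so it does not come back in a new session
cd /root/spark-1.5.1-bin-hadoop2.6
./sbin/start-master.sh           # the rsync/ssh error no longer appears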
But I still get the following error:
failed to launch org.apache.spark.deploy.master.Master:
at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
Could not find the main class: org.apache.spark.launcher.Main. Program will exit.
full log in /root/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-mycomputer.out
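In case it helps with diagnosis, this is how I checked that the launcher class is actually shipped in the pre-built package (a rough sketch; the assembly jar name is an assumption based on the standard layout of the 1.5.1 / Hadoop 2.6 download, and the jar tool must be on the PATH):

cd /root/spark-1.5.1-bin-hadoop2.6
ls lib/                          # the pre-built package keeps its jars here
jar tf lib/spark-assembly-*.jar | grep 'org/apache/spark/launcher/Main'
# as far as I can tell, bin/spark-class builds its launch classpath from
# this assembly jar, so if the class is listed the jar itself looks fine
java -version                    # Spark 1.5 documents Java 7+ as a requirement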
Is anyone aware of a known issue that could cause this?