Spark in standalone mode on Windows does not run properly

Asked: 2018-11-16 11:13:51

Tags: windows apache-spark

On a single Windows machine, I am running both the Spark master and a worker. I am only testing my application code, so I run just one worker.

Note: I am using Cygwin to start the master and the worker.

Master start command:

java -cp "C:\Spark\spark-2.2.2-bin-hadoop2.7/conf\;C:\Spark\spark-2.2.2-bin-hadoop2.7\jars\*"
     -Xmx1g org.apache.spark.deploy.master.Master --host 10.111.30.22 --port 7077 
     --webui-port 8080

Worker start command:

java -cp "C:\Spark\spark-2.2.2-bin-hadoop2.7/conf\;C:\Spark\spark-2.2.2-bin-hadoop2.7\jars\*" 
     -Xmx1g org.apache.spark.deploy.worker.Worker 
     --webui-port 8081 spark://10.111.30.22:7077 --cores 1 --memory 1g
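
For comparison, the bundled bin/spark-class script performs the same launch but first sources load-spark-env.sh, which exports environment variables (such as SPARK_SCALA_VERSION) that the launcher jar expects. A rough Cygwin equivalent of the two commands above, each in its own prompt as before (the install path is my assumption, and I have not verified how well these scripts handle Cygwin path translation):

    # Point the scripts at the binary distribution (Cygwin-style path).
    export SPARK_HOME=/cygdrive/c/Spark/spark-2.2.2-bin-hadoop2.7

    # Launch master and worker through spark-class instead of bare java.
    "$SPARK_HOME"/bin/spark-class org.apache.spark.deploy.master.Master \
        --host 10.111.30.22 --port 7077 --webui-port 8080
    "$SPARK_HOME"/bin/spark-class org.apache.spark.deploy.worker.Worker \
        --webui-port 8081 spark://10.111.30.22:7077 --cores 1 --memory 1g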

After starting them, I can see both the Spark master and the worker come up.

I then opened another prompt and ran the following command to test the bundled SparkPi example:

./bin/spark-submit --class org.apache.spark.examples.SparkPi 
    --master spark://10.111.30.22:7077 --executor-memory 2G 
    --total-executor-cores 2 
    C:\Spark\spark-2.2.2-bin-hadoop2.7\examples\jars\spark-examples_2.11-2.2.2.jar 2
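
As a quick sanity check (untested here), the same example can be submitted with a local master, which bypasses the standalone worker's executor-launch path entirely; if this succeeds, the problem is in how the worker spawns executors rather than in the example jar:

    # Run SparkPi in local mode with 2 threads; no master/worker involved.
    ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master local[2] \
        C:\Spark\spark-2.2.2-bin-hadoop2.7\examples\jars\spark-examples_2.11-2.2.2.jar 2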

The worker fails with the error shown below (originally attached as an image). The job never completes; it keeps retrying forever.

Here is the error dumped on the worker console:

$ java  -cp "C:\Spark\spark-2.2.2-bin-hadoop2.7/conf\;C:\Spark\spark-2.2.2-bin-hadoop2.7\jars*" -Xmx1g org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://10.111.30.22:7077 --cores 1 --memory 1g
18/11/19 13:40:32 ERROR ExecutorRunner: Error running executor
java.lang.IllegalStateException: Cannot find any build directories.
        at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248)
        at org.apache.spark.launcher.AbstractCommandBuilder.getScalaVersion(AbstractCommandBuilder.java:240)
        at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:194)
        at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:117)
        at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:39)
        at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:45)
        at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:63)
        at org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:51)
        at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:145)
        at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:73)
[The identical "ERROR ExecutorRunner: Error running executor" stack trace repeats for every executor the master relaunches.]
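
From my reading of the Spark 2.2 launcher source, AbstractCommandBuilder.getScalaVersion() returns the SPARK_SCALA_VERSION environment variable if it is set, and otherwise scans for launcher/target/scala-2.1x build directories, which exist only in a source checkout, not in a binary distribution like spark-2.2.2-bin-hadoop2.7; that fallback is what throws "Cannot find any build directories." The bundled scripts export SPARK_SCALA_VERSION via load-spark-env.sh, but a bare java -cp launch skips them. A minimal, untested sketch of the worker launch with those variables set explicitly (the values are my assumption):

    # Tell the launcher where the distribution lives and which Scala build to use,
    # which the bundled scripts would normally export on our behalf.
    export SPARK_HOME='C:\Spark\spark-2.2.2-bin-hadoop2.7'
    export SPARK_SCALA_VERSION=2.11

    # Same manual worker launch as above, now with the environment prepared.
    java -cp "$SPARK_HOME\conf;$SPARK_HOME\jars\*" -Xmx1g \
        org.apache.spark.deploy.worker.Worker --webui-port 8081 \
        spark://10.111.30.22:7077 --cores 1 --memory 1g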

======================================================================

Environment: OS: Windows. The Spark master and worker run under Cygwin. Only one worker is used, for testing.

I see no errors in the Spark master console. I do see many executors being created, each ending in the FAILED state; no executor ever completes.

Has anyone run into this problem? What is the fix? Any response is much appreciated.

Update: I am starting the master and worker manually, with exactly the commands listed at the top of this question. The SparkPi job is submitted as:

    spark-submit --class org.apache.spark.examples.SparkPi --master spark://10.111.30.22:7077 C:\Spark\spark-2.2.2-bin-hadoop2.7\examples\jars\spark-examples_2.11-2.2.2.jar 2

0 Answers