Getting Started with Apache Spark

Date: 2018-12-11 17:11:56

Tags: apache-spark

I have configured an 11-node cluster for passwordless SSH, set the JAVA_HOME, SPARK_HOME, and SCALA_HOME variables, and given the user rwx permissions on the Spark subdirectories. When I try to run one of the examples, the web portal shows the executor processes on my 11 nodes being killed. When I run:

bin/spark-submit --deploy-mode cluster --master spark://192.168.1.10:6066 --class org.apache.spark.examples.JavaWordCount --files /home/clusternode/wordTest examples/jars/spark-examples_2.11-2.3.2.jar file://wordTest
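One thing worth noting about this command: file://wordTest is not a valid URI, since a file: URL needs an absolute path, i.e. file:///path/to/file. In cluster deploy mode the driver itself runs on one of the workers, so the input file must also be readable at the same path on every node (or come from a shared filesystem such as HDFS). A sketch of the command with the URI corrected, assuming /home/clusternode/wordTest exists at that path on all nodes and dropping the --files staging, would be:

bin/spark-submit --deploy-mode cluster --master spark://192.168.1.10:6066 \
  --class org.apache.spark.examples.JavaWordCount \
  examples/jars/spark-examples_2.11-2.3.2.jar \
  file:///home/clusternode/wordTest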

When I run the original command, I get the following in each node's stderr:

Spark Executor Command: "/usr/lib/jvm/java-8-openjdk-amd64/bin/java" "-cp" "/usr/local/spark/conf/:/usr/local/spark/jars/*" "-Xmx1024M" "-Dspark.driver.port=46481" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@clusterNode10:46481" "--executor-id" "2" "--hostname" "192.168.1.11" "--cores" "2" "--app-id" "app-20181211105202-0002" "--worker-url" "spark://Worker@192.168.1.11:42779"
========================================
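Note that the stderr above contains only the executor launch command, with no error after it, which usually means the executor was killed from outside rather than crashing on its own. On a standalone cluster the kill reason typically shows up in the worker daemon log or in the per-application work directory; a sketch of where to look, assuming the default log and work locations under $SPARK_HOME:

# Per-executor stdout/stderr for this app id and executor id
ls $SPARK_HOME/work/app-20181211105202-0002/2/
# Worker daemon logs often record why an executor was killed
grep -iE "killed|exit" $SPARK_HOME/logs/spark-*Worker*.out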

Can someone point me in the right direction on this? I'm not sure whether it is classpath-related, because I have tried adding some entries to spark-defaults.conf and have copied the jars into the conf directory.
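For what it's worth, copying jars into the conf directory will not put them on the classpath: the executor command above shows that /usr/local/spark/conf/ itself is on the classpath, but a directory entry only exposes loose class files and resources, not jars inside it. Extra jars are normally added through spark-defaults.conf properties instead; a sketch, assuming a hypothetical /usr/local/spark/extra-jars directory present on every node:

spark.driver.extraClassPath    /usr/local/spark/extra-jars/*
spark.executor.extraClassPath  /usr/local/spark/extra-jars/*

For application dependencies, spark-submit's --jars flag is the more common route, since it distributes the jars to the workers automatically.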

0 Answers