SparkContext fails to start when master is set to "yarn"

Date: 2018-12-17 10:56:41

Tags: scala apache-spark

I'm new to Apache Spark and am trying to run a SparkContext from a Scala API (Play Framework). It works fine when I set the Spark master to "local", but when I set the master to "yarn" it throws an exception:

[SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.]

When I check the container logs, I see the following:

Error: Could not find or load main class org.apache.spark.deploy.yarn.ExecutorLauncher

If I run spark-shell --master yarn, the SparkContext starts without any problem.

Here is my code:

 import org.apache.spark.sql.SparkSession

 val sparkS = SparkSession.builder
   .config("spark.hadoop.validateOutputSpecs", "false")
   .config("spark.executor.memory", "4g")
   .config("spark.driver.memory", "3g")
   .config("spark.rpc.message.maxSize", "2047")
   .config("SPARK_DIST_CLASSPATH", "/usr/local/spark/jars/*")
   .config("spark.yarn.archive", "hdfs://localhost:54310/spark-libs.jar")
   .config("spark.yarn.jars", "/usr/local/spark/jars/*")
   .config("spark.executor.extraJavaOptions", "-XX:+PrintGCDetails -Dkey=value -Dnumbers=\"one two three\"")
   .config("spark.executor.extraLibraryPath", "/usr/local/hadoop-2.8.5/lib/native:/usr/local/hadoop-2.8.5/lib/native/Linux-amd64-64")
   .config("HADOOP_CONF_DIR", "/usr/local/hadoop-2.8.5/etc/hadoop")
   .config("spark.yarn.am.waitTime", "1d")
   .master("yarn")
   .getOrCreate()

Can anyone suggest a solution? Thanks.

1 Answer:

Answer 0 (score: 0)

You have to specify the main class when submitting the Spark jar. Below is the format; a concrete example follows the template:

  ./bin/spark-submit \
  --class <main-class> \
  --master <master-url> \
  --deploy-mode <deploy-mode> \
  --conf <key>=<value> \
  ... # other options
  <application-jar> \
  [application-arguments]
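
For illustration, a hypothetical invocation for a setup like the one in the question might look as follows; the main class name and application jar path are placeholders, not taken from the question:

  ./bin/spark-submit \
  --class com.example.MyPlayApp \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.executor.memory=4g \
  /path/to/my-app.jar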