Running a Spark task with SparkLauncher

Asked: 2017-03-06 16:48:28

Tags: scala apache-spark

From my local Scala app I want to launch a Spark task on my cluster. The task class is my.spark.SparkRunner and it lives in a jar stored on HDFS. This is how I configure it in my local program:

import org.apache.spark.launcher.SparkLauncher

val spark = new SparkLauncher()
  //.setSparkHome("C:/spark-1.6.0-bin-hadoop2.4")
  .setVerbose(true)
  .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar")
  .setMainClass("my.spark.SparkRunner")
  .setMaster("spark://192.168.10.183:7077")
  //.setMaster("192.168.10.183:7077")
  .launch()  // returns the java.lang.Process running spark-submit

spark.waitFor()  // wait for the spark-submit process to exit

It doesn't throw any errors, but it returns immediately and never starts the task. What am I doing wrong? Thanks...
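For context, `launch()` only forks a spark-submit child process and hands back the plain `java.lang.Process`, so reading that process's error stream is the quickest way to see why nothing starts. A minimal sketch, reusing the jar, class and master URL from the question (the stream-reading part is only an illustration, not part of the original code):

import java.io.{BufferedReader, InputStreamReader}
import org.apache.spark.launcher.SparkLauncher

// launch() forks a spark-submit child process; its stderr usually explains
// why the submission died (missing SPARK_HOME, unreachable master, ...).
val process = new SparkLauncher()
  .setVerbose(true)
  .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar")
  .setMainClass("my.spark.SparkRunner")
  .setMaster("spark://192.168.10.183:7077")
  .launch()

val err = new BufferedReader(new InputStreamReader(process.getErrorStream))
Iterator.continually(err.readLine()).takeWhile(_ != null).foreach(println)
println(s"spark-submit exited with code ${process.waitFor()}")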

1 Answer:

Answer 0 (score: 0)

I just added a loop that polls the launcher's state, and that was it...

import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

// startApplication() returns a SparkAppHandle that reports the app's state.
val spark: SparkAppHandle = new SparkLauncher()
  //.setSparkHome("C:/spark-1.6.0-bin-hadoop2.4")
  .setVerbose(true)
  .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar")
  .setMainClass("my.spark.SparkRunner")
  .setMaster("spark://192.168.10.183:7077")
  //.setMaster("192.168.10.183:7077")
  .startApplication()

// Poll until the application reaches a final state (FINISHED, FAILED or KILLED);
// otherwise the local JVM exits before the task is ever submitted.
while (!spark.getState.isFinal) {
  println(spark.getState)
  Thread.sleep(1000)
}
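Polling works, but `startApplication()` also accepts one or more `SparkAppHandle.Listener` callbacks, which removes the sleep loop entirely. A minimal sketch, assuming the same jar, class and master as above (the CountDownLatch wiring is only illustrative):

import java.util.concurrent.CountDownLatch
import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

val done = new CountDownLatch(1)

val handle = new SparkLauncher()
  .setVerbose(true)
  .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar")
  .setMainClass("my.spark.SparkRunner")
  .setMaster("spark://192.168.10.183:7077")
  .startApplication(new SparkAppHandle.Listener {
    // Fired on every state transition (CONNECTED, RUNNING, FINISHED, ...).
    override def stateChanged(h: SparkAppHandle): Unit = {
      println(s"State changed: ${h.getState}")
      if (h.getState.isFinal) done.countDown()
    }
    // Fired when application info such as the app id becomes available.
    override def infoChanged(h: SparkAppHandle): Unit =
      println(s"App id: ${h.getAppId}")
  })

done.await()  // block until the application reaches a final state
println(s"Final state: ${handle.getState}")

Either way, the key point of the answer stands: the local JVM has to stay alive until the launcher reports a final state, otherwise it exits before the application ever runs.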