How to add the "--deploy-mode cluster" option in my Scala code

Time: 2017-05-11 12:28:53

Tags: scala apache-spark spark-streaming apache-spark-standalone

Hi, I want to add the option "--deploy-mode cluster" in my Scala code:

  val sparkConf = new SparkConf().setMaster("spark://192.168.60.80:7077")

without using the shell (the spark-submit command).

I want to set "spark.submit.deployMode" from Scala instead.

2 answers:

Answer 0 (score: 4)

Using SparkConf:

// set up the Spark configuration and create the context
// (note: .set() belongs on SparkConf, not on SparkContext)
val sparkConf = new SparkConf()
  .setAppName("SparkApp")
  .setMaster("spark://192.168.60.80:7077")
  .set("spark.submit.deployMode", "cluster")

val sc = new SparkContext(sparkConf)

Using SparkSession:

val spark = SparkSession
   .builder()
   .appName("SparkApp")
   .master("spark://192.168.60.80:7077")
   .config("spark.submit.deployMode","cluster")
   .enableHiveSupport()
   .getOrCreate()

Answer 1 (score: 2)

You can use

 val sparkConf = new SparkConf().setMaster("spark://192.168.60.80:7077").set("spark.submit.deployMode", "cluster")
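Putting the two answers together, a minimal self-contained sketch might look like the following. The master URL is the one from the question; the object name `DeployModeExample` and the app name "SparkApp" are placeholders, and this assumes Spark is on the classpath.

```scala
import org.apache.spark.sql.SparkSession

object DeployModeExample {
  def main(args: Array[String]): Unit = {
    // Build a session with the master URL and the deploy-mode property
    // set programmatically, instead of passing --deploy-mode to spark-submit.
    val spark = SparkSession
      .builder()
      .appName("SparkApp")
      .master("spark://192.168.60.80:7077")
      .config("spark.submit.deployMode", "cluster")
      .getOrCreate()

    // Read the setting back from the runtime configuration to confirm
    // it was picked up.
    println(spark.conf.get("spark.submit.deployMode"))

    spark.stop()
  }
}
```

Note that `.config(...)` on the builder and `.set(...)` on `SparkConf` both just store a key/value pair in the configuration; whether the driver actually launches on the cluster still depends on how the application is started.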