I have a Spark job written in Scala (let's call it WordCount) that I can run in the following ways:

Run it against a local Spark instance from within sbt:
sbt> runMain WordCount [InputFile] [OutputDir] local[*]
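For reference, running it from sbt like this assumes spark-core is on the sbt classpath. A minimal build.sbt sketch follows; only the Scala version (2.10) and project version (1.5.0-SNAPSHOT) are implied by the jar name below, the Spark version 1.5.0 is an assumption:

// build.sbt -- minimal sketch, Spark version is assumed
name := "wordcount"
version := "1.5.0-SNAPSHOT"
scalaVersion := "2.10.5"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0"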
Run it on a remote Spark cluster by packaging the jar and using spark-submit:
sbt> package
$> spark-submit --master spark://192.168.1.1:7077 --class WordCount target/scala-2.10/wordcount_2.10-1.5.0-SNAPSHOT.jar [InputFile] [OutputDir]
Code:
import org.apache.spark.{SparkConf, SparkContext}

// get arguments
val inputFile = args(0)
val outputDir = args(1)
// if a 3rd argument is given, use it as the Spark master URL (e.g. local[*]);
// otherwise the master is expected to come from spark-submit
val conf =
  if (args.length == 3) new SparkConf().setAppName("WordCount").setMaster(args(2))
  else new SparkConf().setAppName("WordCount")
val sc = new SparkContext(conf)
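The rest of the job is omitted above; for context, a typical word-count body using inputFile and outputDir would look roughly like this (a sketch of the usual pattern, not necessarily my exact code):

// sketch only: the actual transformation in my job may differ
val counts = sc.textFile(inputFile)
  .flatMap(line => line.split("\\s+"))
  .map(word => (word, 1))
  .reduceByKey(_ + _)
counts.saveAsTextFile(outputDir)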
How can I run this job on the remote Spark cluster from within sbt?