How to run a main method from Spark (in Databricks)

Time: 2017-01-17 16:03:09

Tags: scala apache-spark jar databricks

I created a simple piece of Scala code:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object app2 {
  def main(args: Array[String]): Unit = {
    // Configure and start a new SparkContext explicitly
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    println("This is a simple scala code v2 - it is running a Spark code.")
    sc.stop()
  }
}

I then compiled it with SBT and created a JAR file.
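
For reference, a build.sbt along these lines would produce such a JAR. This is only a minimal sketch; the Scala and Spark version numbers are assumptions, not values taken from the question:

// build.sbt -- minimal sketch; the version numbers below are assumptions
name := "app2"
version := "1.0"
scalaVersion := "2.11.8"

// Spark is marked "provided" because the Databricks cluster already supplies it
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2" % "provided"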

I then added the JAR to the Spark notebook:

sc.addJar("some/path")
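Note that sc.addJar only ships the JAR to the executors for use by tasks; it does not put the classes on the driver's classpath, so for app2 to be callable from a notebook cell the JAR generally also has to be attached to the cluster as a library.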

How can I run (call) this main method (app2) from a Databricks notebook so that I can see the output of the println statement?

1 Answer:

Answer 0 (score: 1)

Thanks 'Ronak'. This seems to be the winning combination for the Scala code:

/* app4.scala */
import org.apache.spark.SparkContext

object app4 {
  def main(args: Array[String]): Unit = {
    // Reuse the SparkContext that Databricks has already created for the
    // notebook, instead of constructing a new one (which fails when a
    // context already exists in the JVM)
    val goodSparkContext = SparkContext.getOrCreate()
    println("This is a simple scala code v4 - it is running a Spark code.")
  }
}
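
With the JAR attached to the cluster, the method can then be invoked directly from a notebook cell. A minimal sketch (the empty argument array is just a placeholder):

// call the packaged main method from a notebook cell
app4.main(Array.empty[String])

Because SparkContext.getOrCreate() returns the context the notebook is already running in, the method deliberately does not call stop(); stopping the shared context would break the notebook's own Spark session.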