I created a simple piece of Scala code:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object app2 {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    println("This is a simple scala code v2 - it is running a Spark code.")
    sc.stop()
  }
}
I then compiled it with SBT and created a JAR file.
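For reference, a minimal build.sbt sketch that would compile this against Spark; the Scala and Spark versions below are assumptions, not taken from the original post, and should be matched to the Databricks runtime in use:

// build.sbt - minimal sketch; the versions here are assumed, adjust to your cluster.
name := "app2"
version := "0.1"
scalaVersion := "2.11.12"
// "provided" scope, because the Databricks cluster supplies Spark at runtime.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5" % "provided"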
Then I added the JAR to the Spark notebook:
sc.addJar("some/path")
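On Databricks the JAR is usually uploaded to DBFS first; a hedged sketch of what the call might look like (the path below is purely hypothetical and only illustrates the form):

// Hypothetical DBFS location - substitute the actual path of the uploaded JAR.
sc.addJar("dbfs:/FileStore/jars/app2.jar")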
How can I run (invoke) this main method (app2) from a Databricks notebook so that I can see the output of the println statement?
Answer (score: 1)
Thanks, Ronak. The key is to reuse the SparkContext that the Databricks notebook already provides, via SparkContext.getOrCreate(), rather than constructing a new one. This turned out to be the winning combination for the Scala code:
/* app4.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object app4 {
  def main(args: Array[String]): Unit = {
    // Reuse the SparkContext the notebook already created instead of
    // constructing a new one, which would conflict with it on Databricks.
    val goodSparkContext = SparkContext.getOrCreate()
    println("This is a simple scala code v4 - it is running a Spark code.")
  }
}
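With the JAR attached to the cluster, the main method can then be called directly from a notebook cell; a minimal sketch (the empty argument array is an assumption, since app4 ignores its arguments):

// Invoke the packaged main method from a notebook cell; the println
// output appears in the cell's results.
app4.main(Array.empty[String])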