Building a Scala JAR file for Spark

Date: 2018-03-22 09:30:33

Tags: scala apache-spark

My build.sbt file (I am using IntelliJ):

scalaVersion := "2.11.8"
resolvers += "MavenRepository" at "http://central.maven.org/maven2"
resolvers += "spark-packages" at "https://dl.bintray.com/spark-packages/maven/"

libraryDependencies ++= {
  val sparkVersion = "2.2.1"
  Seq("org.apache.spark" %% "spark-core" % sparkVersion)
}

I am trying to build a JAR and deploy it to Spark. I issued the following commands:

sbt compile

sbt assembly

The compile succeeds, but assembly fails with the following error message:

java.lang.RuntimeException: Please add any Spark dependencies by supplying the sparkVersion and sparkComponents. Please remove: org.apache.spark:spark-core:2.2.1

我尝试添加"provided"以防止编译本身失败,因为"provided"关键字 不包括那些JAR

What am I doing wrong?

1 answer:

Answer 0 (score: 1)

First, you need to add the sbt-assembly plugin and its settings, which will build the jar for you.

In project/plugins.sbt add:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

Then add this in build.sbt:
// set this to the fully qualified name of your main class (not the jar name)
mainClass := Some("your.package.MainClass")
assemblyMergeStrategy in assembly := {
  // drop duplicate META-INF entries from dependency jars; keep the first copy of anything else
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
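
The merge strategy is needed because many dependency jars ship overlapping files under META-INF (manifests, signature files), and sbt-assembly otherwise aborts with "deduplicate" errors; discarding META-INF and taking the first copy of everything else is the usual workaround. With the plugin and these settings in place, build the fat jar with:

sbt assembly

By default sbt-assembly writes the result to something like target/scala-2.11/<project-name>-assembly-<version>.jar (the exact name depends on your project's name and version settings).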

You can refer to my GitHub for creating the jar and deploying it.
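
For the deployment step, a typical spark-submit invocation looks roughly like this (the class name, master URL, and jar path below are placeholders, not values from the question):

spark-submit \
  --class your.package.MainClass \
  --master local[*] \
  target/scala-2.11/your-project-assembly-0.1.jar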