I am trying to build a Spark application after upgrading sbt-assembly to 0.13.8. Below is the content of my current build.sbt, but it isn't working. It appears to be a problem with the default mergeStrategy: I get roughly 600+ deduplicate errors.
name := "volumeApp"
version := "0.0.1"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming-kafka" % "1.3.1",
  "org.apache.spark" %% "spark-core" % "1.3.1",
  "org.apache.spark" %% "spark-streaming" % "1.3.1",
  "org.apache.kafka" %% "kafka" % "0.8.2.1",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.3.0-M1",
  "com.github.vonnagy" %% "service-container-metrics-reporting" % "1.0.1" exclude("com.codahale.metrics", "metrics-core"),
  "joda-time" % "joda-time" % "2.7",
  "log4j" % "log4j" % "1.2.14"
)
assemblyJarName in assembly := "inventoryVolume.jar"
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", "jboss-beans.xml") => MergeStrategy.first
  case PathList("META-INF", "mailcap") => MergeStrategy.discard
  case PathList("META-INF", "maven", "org.slf4j", "slf4j-api", xa @ _*) => MergeStrategy.rename
  case PathList("META-INF", "ECLIPSEF.RSA") => MergeStrategy.discard
  case PathList("META-INF", "mimetypes.default") => MergeStrategy.first
  case PathList("com", "datastax", "driver", "core", "Driver.properties") => MergeStrategy.last
  case PathList("com", "esotericsoftware", "minlog", xx @ _*) => MergeStrategy.first
  case PathList("plugin.properties") => MergeStrategy.discard
  case PathList("javax", "activation", xw @ _*) => MergeStrategy.first
  case PathList("org", "apache", "hadoop", "yarn", xv @ _*) => MergeStrategy.first
  case PathList("org", "apache", "commons", xz @ _*) => MergeStrategy.first
  case PathList("org", "jboss", "netty", ya @ _*) => MergeStrategy.first
  case x =>
    val baseStrategy = (assemblyMergeStrategy in assembly).value
    baseStrategy(x)
}
I have attached the errors here: http://pastebin.com/T6HRJ6Kv. Can someone help me? I am completely new to build tools; I tried reading the sbt guide but quickly got lost.
Answer 0 (score: 0)
It looks like you are trying to build an assembly jar that contains multiple versions of the Akka actor system. There are a few possible solutions: one is to exclude the duplicate version from your dependencies (see http://www.scala-sbt.org/0.13/docs/Library-Management.html for an idea of how), another is to change your merge strategy, but I would recommend the dependency-exclusion approach.
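A minimal sketch of the exclusion approach, assuming the duplicate Akka classes are pulled in transitively by the service-container dependency (you should verify which artifacts actually conflict by inspecting the deduplicate error messages or the output of a dependency-graph plugin before excluding anything):

```scala
// Hypothetical build.sbt fragment: drop all transitive com.typesafe.akka
// artifacts from this dependency, so that only one copy of the Akka classes
// (the one Spark already ships with) ends up on the assembly classpath.
libraryDependencies ++= Seq(
  "com.github.vonnagy" %% "service-container-metrics-reporting" % "1.0.1"
    exclude("com.codahale.metrics", "metrics-core")
    excludeAll ExclusionRule(organization = "com.typesafe.akka")
)
```

With the duplicate jars gone, most of the deduplicate errors should disappear, leaving only the handful of genuine resource conflicts (META-INF files, properties files) that your existing mergeStrategy cases already handle.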