I'm trying to compile and package a fat jar with SBT, and I keep running into the error below. I have already tried library dependency exclusions and merge strategies.
[trace] Stack trace suppressed: run last *:assembly for the full output.
[error] (*:assembly) deduplicate: different file contents found in the following:
[error] /Users/me/.ivy2/cache/org.slf4j/slf4j-api/jars/slf4j-api-1.7.10.jar:META-INF/maven/org.slf4j/slf4j-api/pom.properties
[error] /Users/me/.ivy2/cache/com.twitter/parquet-format/jars/parquet-format-2.2.0-rc1.jar:META-INF/maven/org.slf4j/slf4j-api/pom.properties
[error] Total time: 113 s, completed Jul 10, 2015 1:57:21 AM
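The exclusions I tried looked roughly like this (just a sketch, and I varied the coordinates), but the conflict remained:
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.3.1" excludeAll ExclusionRule(organization = "org.slf4j")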
The current version of my build.sbt file is as follows:
import AssemblyKeys._
assemblySettings
name := "ldaApp"
version := "0.1"
scalaVersion := "2.10.4"
mainClass := Some("myApp")
libraryDependencies +="org.scalanlp" %% "breeze" % "0.11.2"
libraryDependencies +="org.scalanlp" %% "breeze-natives" % "0.11.2"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.3.1"
libraryDependencies +="org.ini4j" % "ini4j" % "0.5.4"
jarName in assembly := "myApp"
net.virtualvoid.sbt.graph.Plugin.graphSettings
libraryDependencies += "org.slf4j" %% "slf4j-api"" % "1.7.10" % "provided"
I realize I'm doing something wrong... I just don't know what.
Answer 0 (score: 0)
Here is how to handle these merge conflicts.
import sbtassembly.Plugin._
lazy val assemblySettings = sbtassembly.Plugin.assemblySettings ++ Seq(
  publishArtifact in packageScala := false, // Remove scala from the uber jar
  mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
    {
      case PathList("META-INF", "CHANGES.txt") => MergeStrategy.first
      // ...
      case PathList(ps @ _*) if ps.last endsWith "pom.properties" => MergeStrategy.first
      case x => old(x)
    }
  }
)
Then add these settings to your project:
lazy val projectToJar = Project(id = "MyApp", base = file(".")).settings(assemblySettings: _*)
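On newer sbt-assembly releases (0.12+, enabled with addSbtPlugin rather than the old assemblySettings), the same idea is expressed with assemblyMergeStrategy. A rough sketch, following the plugin's documented pattern:
assemblyMergeStrategy in assembly := {
  // Take the first copy of any conflicting pom.properties
  case PathList(ps @ _*) if ps.last endsWith "pom.properties" => MergeStrategy.first
  case x =>
    // Fall back to the default strategy for everything else
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}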
Answer 1 (score: 0)
I got your assembly build to run by removing Spark from the fat jar (mllib is already included in Spark).
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.1" % "provided"
As vitalii said in the comments, this solution was already here. I understand that spending hours on a problem without finding the fix can be frustrating, but please be nice.