sbt assembly shading to create a fat jar to run on Spark

Date: 2017-08-31 19:48:02

Tags: apache-spark sbt guava grpc sbt-assembly

I am using sbt assembly to create a fat jar that can run on Spark. It has a dependency on grpc-netty. The Guava version on Spark is older than the one grpc-netty requires, and I run into this error: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument. I was able to get around it by setting userClassPathFirst to true on Spark, but that leads to other errors.
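For reference, a minimal sketch of what that workaround typically looks like, assuming the standard spark.driver.userClassPathFirst / spark.executor.userClassPathFirst settings (an illustration, not the exact configuration used here):

import org.apache.spark.SparkConf

// Sketch only: make classes from the user jar win over Spark's bundled jars.
val conf = new SparkConf()
  .set("spark.driver.userClassPathFirst", "true")
  .set("spark.executor.userClassPathFirst", "true")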

Correct me if I am wrong, but from my understanding, if I do the shading correctly I should not need to set userClassPathFirst to true. This is how I am doing the shading at the moment:

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.guava.**" -> "my_conf.@1")
    .inLibrary("com.google.guava" % "guava" % "20.0")
    .inLibrary("io.grpc" % "grpc-netty" % "1.1.2")
)

libraryDependencies ++= Seq(
  "org.scalaj" %% "scalaj-http" % "2.3.0",
  "org.json4s" %% "json4s-native" % "3.2.11",
  "org.json4s" %% "json4s-jackson" % "3.2.11",
  "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided",
  "org.clapper" %% "argot" % "1.0.3",
  "com.typesafe" % "config" % "1.3.1",
  "com.databricks" %% "spark-csv" % "1.5.0",
  "org.apache.spark" % "spark-mllib_2.11" % "2.2.0" % "provided",
  "io.grpc" % "grpc-netty" % "1.1.2",
  "com.google.guava" % "guava" % "20.0"
)

What am I doing wrong here, and how do I fix it?

1 Answer:

Answer 0 (score: 5):

You are almost there. What ShadeRule.rename does is rename class names, not library names:

  The main ShadeRule.rename rule is used to rename classes. All references to the renamed classes will also be updated.
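As a hypothetical illustration of those rename semantics (not part of the original answer): the left-hand pattern matches fully qualified class names, and @1 stands for whatever the ** wildcard captured.

// Hypothetical example: this rule matches classes under the a.b package.
//   a.b.c.Foo -> shaded.c.Foo   ("c.Foo" is what ** captured)
//   a.b.Bar   -> shaded.Bar
// Classes outside a.b are untouched, and every bytecode reference to the
// renamed classes is rewritten to the new names.
ShadeRule.rename("a.b.**" -> "shaded.@1")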

In fact, there are no classes in com.google.guava:guava whose name contains com.google.guava at all:

$ jar tf ~/Downloads/guava-20.0.jar  | sed -e 's:/[^/]*$::' | sort | uniq
META-INF
META-INF/maven
META-INF/maven/com.google.guava
META-INF/maven/com.google.guava/guava
com
com/google
com/google/common
com/google/common/annotations
com/google/common/base
com/google/common/base/internal
com/google/common/cache
com/google/common/collect
com/google/common/escape
com/google/common/eventbus
com/google/common/graph
com/google/common/hash
com/google/common/html
com/google/common/io
com/google/common/math
com/google/common/net
com/google/common/primitives
com/google/common/reflect
com/google/common/util
com/google/common/util/concurrent
com/google/common/xml
com/google/thirdparty
com/google/thirdparty/publicsuffix

Change the shade rule to:

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "my_conf.@1")
    .inLibrary("com.google.guava" % "guava" % "20.0")
    .inLibrary("io.grpc" % "grpc-netty" % "1.1.2")
)
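With this rule, Guava's classes are relocated under my_conf and grpc-netty's bytecode references are rewritten to follow them, so the older Guava that Spark puts on the classpath is never consulted. Roughly, as an illustration rather than actual output:

// com.google.common.base.Preconditions    -> my_conf.base.Preconditions
// com.google.common.collect.ImmutableList -> my_conf.collect.ImmutableList
// grpc-netty now calls my_conf.base.Preconditions.checkArgument, so the
// Spark-provided (older) Preconditions class no longer matters.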

With that in place you should not need to touch userClassPathFirst at all.

Additionally, you can simplify the shade rule like this:

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "my_conf.@1").inAll
)
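Note that .inAll applies the rule to every jar on the assembly classpath as well as to the project's own classes; in this build only Guava's classes match the pattern, so the result is the same. If you ever need finer control, sbt-assembly also lets you scope a rule with .inProject in addition to .inLibrary; a sketch reusing the coordinates above:

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "my_conf.@1")
    .inLibrary("com.google.guava" % "guava" % "20.0")
    .inLibrary("io.grpc" % "grpc-netty" % "1.1.2")
    .inProject
)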

Since the org.apache.spark dependencies are marked as provided, they will not be included in your jar and will not be shaded (so Spark will use its own, unshaded version of Guava on the cluster).
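One side effect of marking the Spark artifacts as provided is that they also disappear from the classpath that sbt run uses. If you want to run the application locally from sbt, a common workaround (described in the sbt-assembly documentation on provided dependencies) is to re-wire the run task; a sketch in sbt 0.13 syntax, matching the build above:

run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated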