I am developing a standalone Scala application that updates a Redis server (the application is executed as a Spark job). My application's Scala version is 2.10.
I use "net.debasishg" % "redisclient_2.10" % "2.13" to work with the Redis server.
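For context, a minimal sketch of the kind of code the job runs against Redis (the host, port, and key below are placeholders for illustration, not my actual values):

import com.redis.RedisClient

object Main {
  def main(args: Array[String]): Unit = {
    // connect to the Redis server (host/port assumed for illustration)
    val redis = new RedisClient("localhost", 6379)
    // update a value, as the Spark job does
    redis.set("example-key", "example-value")
    println(redis.get("example-key")) // Some(example-value)
  }
}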
I run the application from IntelliJ IDEA and it works without any errors. I then create a jar of my application with the "activator package" command. When I execute that jar, it fails with the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: com/redis/RedisClient
at Main$.main(Main.scala:55)
at Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.redis.RedisClient
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 9 more
Does anyone know how to overcome this problem? (I tried using the sbt-assembly plugin, but unfortunately it did not solve my problem.)
Thanks.
Answer 0 (score: 0)
I solved this problem as follows.
Created an assembly.sbt file and added it to the application's project directory. This file adds the sbt-assembly plugin (the Spark dependency itself, "org.apache.spark" %% "spark-core" % "1.0.0", stays in build.sbt below).
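A minimal sketch of that file, assuming the 0.11.x line of sbt-assembly (which matches the old "mergeStrategy ... <<=" syntax used below); the exact version is an assumption:

// project/assembly.sbt -- registers the sbt-assembly plugin (version assumed)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")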
Added the following lines to the build.sbt file:
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "2.1.6" % "test",
  ("org.apache.spark" %% "spark-core" % "1.0.0").
    exclude("org.eclipse.jetty.orbit", "javax.servlet").
    exclude("org.eclipse.jetty.orbit", "javax.transaction").
    exclude("org.eclipse.jetty.orbit", "javax.mail").
    exclude("org.eclipse.jetty.orbit", "javax.activation").
    exclude("commons-beanutils", "commons-beanutils-core").
    exclude("commons-collections", "commons-collections").
    exclude("com.esotericsoftware.minlog", "minlog"),
  ("org.apache.hadoop" % "hadoop-client" % "2.2.0").
    exclude("com.twitter", "parquet-column"),
  "org.apache.spark" % "spark-sql_2.10" % "1.2.0",
  "org.apache.spark" % "spark-hive_2.10" % "1.2.0",
  ("net.debasishg" % "redisclient_2.10" % "2.14").excludeAll(
    ExclusionRule(organization = "org.slf4j"),
    ExclusionRule(organization = "com.twitter")
  )
)

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) => {
    case PathList("javax", "servlet", xs @ _*) => MergeStrategy.first
    case PathList(ps @ _*) if ps.last endsWith ".html" => MergeStrategy.first
    case PathList(ps @ _*) if ps.last endsWith ".properties" => MergeStrategy.first
    case PathList(ps @ _*) if ps.last endsWith ".xml" => MergeStrategy.first
    case PathList(ps @ _*) if ps.last endsWith ".class" => MergeStrategy.first
    case PathList(ps @ _*) if ps.last endsWith ".thrift" => MergeStrategy.first
    case "application.conf" => MergeStrategy.concat
    case "unwanted.txt" => MergeStrategy.discard
    case x => old(x)
  }
}
Finally, clean the application with the "activator clean" command and then run "activator assembly" to package the project.
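In other words, the build-and-submit sequence is roughly the following (the assembled jar name is a placeholder; sbt-assembly writes the fat jar under target/scala-2.10/):

activator clean
activator assembly
spark-submit --class Main target/scala-2.10/<your-app>-assembly-<version>.jar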
Thanks.