My application uses commons-configuration2 and commons-beanutils 1.9, but when I try to use my application jar in a Spark streaming job, it throws the following exception:
java.lang.NoSuchMethodError: org.apache.commons.beanutils.PropertyUtilsBean.addBeanIntrospector(Lorg/apache/commons/beanutils/BeanIntrospector;)V
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBeanUtilsBean(BeanHelper.java:631)
    at org.apache.commons.configuration2.beanutils.BeanHelper.<clinit>(BeanHelper.java:89)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at com.sun.proxy.$Proxy23.<clinit>(Unknown Source)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:739)
    at org.apache.commons.configuration2.builder.fluent.Parameters.createParametersProxy(Parameters.java:294)
    at org.apache.commons.configuration2.builder.fluent.Parameters.fileBased(Parameters.java:185)
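PropertyUtilsBean.addBeanIntrospector only exists as of commons-beanutils 1.9.0, so the error means some older beanutils (1.7/1.8, or the overlapping commons-beanutils-core) is the one actually being loaded. As a quick check, a minimal diagnostic sketch (class name is mine; run it on the same classpath as the Spark job) prints which jar the class really comes from and whether the method is present:

    import org.apache.commons.beanutils.PropertyUtilsBean

    object BeanUtilsCheck extends App {
      val cls = classOf[PropertyUtilsBean]
      // Which jar the class was actually loaded from
      println(cls.getProtectionDomain.getCodeSource.getLocation)
      // true only for commons-beanutils >= 1.9.0
      println(cls.getMethods.exists(_.getName == "addBeanIntrospector"))
    }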
Here is my build.sbt:
libraryDependencies ++= Seq(
  "org.apache.commons" % "commons-configuration2" % "2.0",
  "commons-beanutils" % "commons-beanutils" % "1.9.2",
  "com.databricks" % "spark-avro_2.10" % "2.0.1",
  "com.databricks" % "spark-csv_2.10" % "1.4.0",
  "org.apache.spark" % "spark-sql_2.10" % "1.5.0" % "provided",
  "org.apache.spark" % "spark-hive_2.10" % "1.4.1" % "provided",
  "org.apache.spark" % "spark-core_2.10" % "1.4.1" % "provided",
  "com.amazonaws" % "aws-java-sdk" % "1.10.61",
  "org.apache.logging.log4j" % "log4j-api" % "2.6.2",
  "org.jasypt" % "jasypt" % "1.9.2",
  "commons-codec" % "commons-codec" % "1.8",
  "org.apache.kafka" % "kafka-clients" % "0.10.0.0",
  "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.3",
  "org.apache.spark" % "spark-streaming_2.10" % "1.6.3" excludeAll(ExclusionRule(organization = "commons-beanutils"))
)

dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4",
  "org.apache.logging.log4j" % "log4j-api" % "2.6.2",
  "org.apache.logging.log4j" % "log4j-core" % "2.6.2",
  "org.apache.commons" % "commons-configuration2" % "2.0",
  "commons-beanutils" % "commons-beanutils" % "1.9.2"
)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
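I assume the catch-all MergeStrategy.first is part of the problem: when both the old and the new beanutils jars land in the assembly, whichever copy of a duplicated class the classpath lists first wins, so 1.7/1.8 classes can silently shadow 1.9.2. A sketch that would at least surface the conflict at build time, using sbt-assembly's MergeStrategy.deduplicate (which fails the build on differing duplicates):

    assemblyMergeStrategy in assembly := {
      // Fail loudly if two different copies of a beanutils class collide
      case PathList("org", "apache", "commons", "beanutils", _*) => MergeStrategy.deduplicate
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case x => MergeStrategy.first
    }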
How can I make sure it uses commons-beanutils-1.9.2, and not the commons-beanutils-1.7 or commons-beanutils-core-1.8 that come in transitively through hadoop-common? Note that dependencyOverrides only pins commons-beanutils itself; commons-beanutils-core is a separate artifact that ships the same packages, so the override never touches it.
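To see exactly which dependencies drag the old artifacts in, the sbt-dependency-graph plugin can print the full tree (the coordinates below are my assumption of the sbt 0.13 line of that plugin):

    // project/plugins.sbt
    addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

Running sbt dependencyTree then shows every path to commons-beanutils; sbt's built-in evicted task also lists which versions lost each conflict.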
Answer 0 (score: 0)
Excluding the unwanted jars in the project settings worked for me:
...
.settings(assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  // File names of the stale jars that hadoop-common pulls in
  val excludes = Set(
    "commons-beanutils-core-1.8.0.jar",
    "commons-beanutils-1.7.0.jar",
    "commons-beanutils-1.8.0.jar"
  )
  // Return the classpath entries that should be kept OUT of the fat jar
  cp.filter { jar => excludes.contains(jar.data.getName) }
})
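Note that assemblyExcludedJars only filters what goes into the fat jar; the old artifacts remain on the compile classpath. An alternative that keeps them out of dependency resolution altogether, assuming sbt 0.13.8 or newer where excludeDependencies accepts exclusion rules:

    // Build-wide: never resolve the stale artifact at all; the explicit
    // commons-beanutils 1.9.2 dependency still supplies the classes.
    excludeDependencies += ExclusionRule("commons-beanutils", "commons-beanutils-core")

One more caveat: because the Spark/Hadoop dependencies are "provided", the cluster still puts its own (old) beanutils on the runtime classpath. If the error persists even with a clean assembly, the experimental spark.driver.userClassPathFirst / spark.executor.userClassPathFirst settings tell Spark to prefer the classes in your jar.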