When I run this in a terminal:
sudo spark-submit --master local --class xxx.xxxx.xxx.xxxx.xxxxxxxxxxxxJob --conf 'spark.driver.extraJavaOptions=-Dconfig.resource=xxx.conf' /home/xxxxx/workspace/prueba/pruebas/target/scala-2.11/MiPrueba.jar
I get the following error:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
    at pureconfig.DurationUtils$.words(DurationUtils.scala:36)
    at pureconfig.DurationUtils$.pureconfig$DurationUtils$$expandLabels(DurationUtils.scala:38)
    at pureconfig.DurationUtils$$anonfun$2.apply(DurationUtils.scala:53)
    at pureconfig.DurationUtils$$anonfun$2.apply(DurationUtils.scala:53)
    at scala.collection.immutable.List.flatMap(List.scala:338)
    at pureconfig.DurationUtils$.<init>(DurationUtils.scala:53)
    at pureconfig.DurationUtils$.<clinit>(DurationUtils.scala)
    at pureconfig.DurationReaders$class.$init$(BasicReaders.scala:114)
    at pureconfig.ConfigReader$.<init>(ConfigReader.scala:121)
    at pureconfig.ConfigReader$.<clinit>(ConfigReader.scala)
    at xxx.xxxx.xxx.xxxx.config.package$Config$.load(package.scala:67)
    at xxx.xxxx.xxx.xxxx.job.xxxxJob$class.main(XXXxxx.scala:23)
    at xxx.xxxx.xxx.xxxx......Job$.main(Xxxxxxxxxxxx.scala:19)
    at xxx.xxxx.xxx.xxxx..main(XXXXXXxxxxxxxx.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
My build definition contains:
version := "0.1"
scalaVersion := "2.11.11"
libraryDependencies:
val dependFullList = spark ++ hadoop ++ apisDownload ++ configuration
configuration:
val configuration = Seq(
"com.github.pureconfig" %% "pureconfig" % "0.9.2",
"com.typesafe" % "config" % "1.3.1",
"org.lz4" % "lz4-java" % "1.4.1"
)
spark:
val spark = Seq(
"org.apache.spark" %% "spark-core" % Versions.spark % "provided" exclude("javax.jms", "jms"),
"org.apache.spark" %% "spark-sql" % Versions.spark % "provided",
"com.databricks" %% "spark-xml" % "0.4.1"
// https://mvnrepository.com/artifact/mrpowers/spark-daria
)
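For reference, the Config.load frame in the trace (package.scala:67) is a pureconfig call. A minimal sketch of what such a loader looks like with pureconfig 0.9.x (the case class, field names and object below are hypothetical, not the project's actual code):

import scala.concurrent.duration.FiniteDuration
import pureconfig._  // pureconfig 0.9.x: automatic ConfigReader derivation is in scope

// Hypothetical config shape; the real fields read from xxx.conf are not shown here.
case class AppConfig(name: String, timeout: FiniteDuration)

object Config {
  // Materializing a ConfigReader forces pureconfig's static initializers
  // (DurationUtils), which is exactly where the NoSuchMethodError surfaces.
  def load(): AppConfig = loadConfigOrThrow[AppConfig]
}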
Any ideas?
Answer (score: 1)
You are mixing Scala versions. The pre-built Spark 2.4.2 binaries are compiled against Scala 2.12, so they ship a 2.12 scala-library that is binary-incompatible with your Scala 2.11 jar, which is why the NoSuchMethodError comes from scala.Predef. Either switch to Spark 2.4.0 (built for Scala 2.11) or move your project and its libraries to Scala 2.12.
https://spark.apache.org/releases/spark-release-2-4-2.html
Note that Scala 2.11 support is deprecated as of 2.4.1. As of version 2.4.2, the pre-built convenience binaries are compiled for Scala 2.12. Spark is still cross-published for 2.11 and 2.12 in Maven Central, and can be built for 2.11 from the sources.
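For example, a minimal build.sbt sketch of the second option, moving the whole build to Scala 2.12 (version numbers below are illustrative; check that every %% dependency you use is actually published for 2.12):

scalaVersion := "2.12.8"

libraryDependencies ++= Seq(
  "org.apache.spark"      %% "spark-core" % "2.4.2" % "provided",
  "org.apache.spark"      %% "spark-sql"  % "2.4.2" % "provided",
  "com.databricks"        %% "spark-xml"  % "0.5.0",  // 0.4.1 was not published for 2.12
  "com.github.pureconfig" %% "pureconfig" % "0.9.2",
  "com.typesafe"          %  "config"     % "1.3.1",
  "org.lz4"               %  "lz4-java"   % "1.4.1"
)

You can confirm which Scala version your installed Spark was built with by running spark-submit --version, which prints a line like "Using Scala version 2.12.8".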