java.lang.NoSuchMethodError: breeze.linalg.tile$.tile_DM_Impl2

Asked: 2017-09-15 21:51:24

Tags: scala apache-spark apache-spark-ml scala-breeze

I have Spark code that uses Breeze. I can check which Breeze version my project pulls in:

$ gradle dependencies | grep breeze
|    |    +--- org.scalanlp:breeze_2.11:0.12
|    |    |    +--- org.scalanlp:breeze-macros_2.11:0.12
+--- org.scalanlp:breeze_2.11:0.12 (*)
|    |    +--- org.scalanlp:breeze_2.11:0.12
|    |    |    +--- org.scalanlp:breeze-macros_2.11:0.12
+--- org.scalanlp:breeze_2.11:0.12 (*)
|    |    +--- org.scalanlp:breeze_2.11:0.12
|    |    |    +--- org.scalanlp:breeze-macros_2.11:0.12
+--- org.scalanlp:breeze_2.11:0.12 (*)
|    |    |    +--- org.scalanlp:breeze_2.11:0.12
|    |    |    |    +--- org.scalanlp:breeze-macros_2.11:0.12
|    +--- org.scalanlp:breeze_2.11:0.12 (*)
|    |    |    +--- org.scalanlp:breeze_2.11:0.12
|    |    |    |    +--- org.scalanlp:breeze-macros_2.11:0.12
|    +--- org.scalanlp:breeze_2.11:0.12 (*)

The Breeze version bundled with Spark 2.1.1 is also 0.12. I can see it in the Spark jars directory:

spark-2.1.1-bin-hadoop2.4$ find . -name *.jar | grep breeze
./jars/breeze_2.11-0.12.jar
./jars/breeze-macros_2.11-0.12.jar

But when I submit the job to Spark (even in local mode), I get this error:

java.lang.NoSuchMethodError: breeze.linalg.tile$.tile_DM_Impl2(Lscala/reflect/ClassTag;Lbreeze/storage/Zero;Lbreeze/generic/UFunc$InPlaceImpl2;)Lbreeze/generic/UFunc$UImpl2;
    at mypackage.MyClass.calcOne(MyClass.scala:51)
    at mypackage.MyClass$$anonfun$1.apply(MyClass.scala:36)
    at mypackage.MyClass$$anonfun$1.apply(MyClass.scala:35)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
    at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1336)
    at scala.collection.TraversableOnce$class.fold(TraversableOnce.scala:212)
    at scala.collection.AbstractIterator.fold(Iterator.scala:1336)
    at org.apache.spark.rdd.RDD$$anonfun$fold$1$$anonfun$20.apply(RDD.scala:1044)
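
The code at MyClass.scala:51 isn't shown in the question, but the signature in the error (an Impl2 taking a DenseMatrix and an Int) suggests a call shaped like the following. This is an assumed, minimal sketch, not the asker's actual code; any call that resolves the tile UFunc for a DenseMatrix compiles against breeze.linalg.tile$.tile_DM_Impl2 and fails at runtime if an older Breeze is on the classpath:

    import breeze.linalg.{DenseMatrix, tile}

    object TileRepro {
      def main(args: Array[String]): Unit = {
        val m = DenseMatrix((1.0, 2.0), (3.0, 4.0))
        // Implicit resolution here compiles down to tile$.tile_DM_Impl2,
        // the exact method named in the NoSuchMethodError.
        val t = tile(m, 2)
        println(t)
      }
    }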

The command line used:

spark-2.1.1-bin-hadoop2.4/bin/spark-submit --class my.Main myjar.jar

1 Answer:

Answer 0 (score: 0)

Found the problem:

My SPARK_HOME environment variable was set to an older Spark installation.

So bin/spark-class was resolving its jar dependencies from that other path, where an older Breeze (one without this tile method) was on the classpath. Pointing SPARK_HOME at the spark-2.1.1-bin-hadoop2.4 installation (or unsetting it) makes the error go away.
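
A quick way to confirm this kind of mismatch from inside the application itself (a sketch, not part of the original answer): ask the JVM which jar the breeze class was actually loaded from, and print SPARK_HOME alongside it. With a stale SPARK_HOME, the location points at the old installation's jars directory instead of spark-2.1.1's breeze_2.11-0.12.jar:

    object WhichBreeze {
      def main(args: Array[String]): Unit = {
        // Location of the jar that the running JVM loaded this class from.
        val loc = Class.forName("breeze.linalg.tile$")
          .getProtectionDomain.getCodeSource.getLocation
        println(s"breeze.linalg.tile loaded from: $loc")
        println(s"SPARK_HOME = ${sys.env.getOrElse("SPARK_HOME", "<unset>")}")
      }
    }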