线程" main"中的例外情况java.lang.NoSuchMethodError:org.apache.commons.csv.CSVParser.parse

Date: 2017-01-27 19:40:17

Tags: csv apache-spark sbt

I get this error when running my program:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.commons.csv.CSVParser.parse

Here is my sbt assembly build file:

name := "mytest"

version := "1.0"

scalaVersion := "2.10.6"

organization := "org.test"

val sparkVersion = "1.6.1"

val mahoutVersion = "0.12.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion,
  // Mahout's Spark libs
  "org.apache.mahout" %% "mahout-math-scala" % mahoutVersion,
  "org.apache.mahout" %% "mahout-spark" % mahoutVersion
    exclude("org.apache.spark", "spark-core_2.10"),
  "org.apache.mahout"  % "mahout-math" % mahoutVersion,
  "org.apache.mahout"  % "mahout-hdfs" % mahoutVersion
    exclude("com.thoughtworks.xstream", "xstream")
    exclude("org.apache.hadoop", "hadoop-client"),
  // other external libs
  "com.databricks" % "spark-csv_2.10" % "1.3.2",
  "com.github.nscala-time" %% "nscala-time" % "2.16.0"
    exclude("org.apache.commons", "commons-csv"),
  "org.elasticsearch" % "elasticsearch" % "2.3.0",
  "org.elasticsearch" % "elasticsearch-spark_2.10" % "2.3.0"
    exclude("org.apache.spark", "spark-catalyst_2.10")
    exclude("org.apache.spark", "spark-sql_2.10"))

resolvers += "typesafe repo" at " http://repo.typesafe.com/typesafe/releases/"

resolvers += Resolver.mavenLocal

assemblyMergeStrategy in assembly := {
  case "plugin.properties" => MergeStrategy.discard
  case PathList("org", "joda", "time", "base", "BaseDateTime.class") => MergeStrategy.first
  case PathList("org", "apache", "commons", "csv", "CSVParser.class") => MergeStrategy.first
  case PathList("org", "apache", "commons", "csv", "CSVPrinter.class") => MergeStrategy.first
  case PathList("org", "apache", "commons", "csv", "ExtendedBufferedReader.class") => MergeStrategy.last
  case PathList(ps @ _*) if ps.last endsWith "package-info.class" =>
    MergeStrategy.first
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

I also tested "com.databricks" % "spark-csv_2.10" % "1.5.0" and "com.databricks" % "spark-csv_2.10" % "1.4.0", but I always get the same error. I know it is related to the dependencies. Do I need to add any other library?
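A quick way to see which jar the conflicting class is actually loaded from at runtime is to ask the classloader for its code source. This is only a debugging sketch (it assumes the commons-csv classes are on the runtime classpath), not part of the build above:

// Debugging sketch: print the jar that provides CSVParser at runtime.
object WhichCsvJar {
  def main(args: Array[String]): Unit = {
    val source = classOf[org.apache.commons.csv.CSVParser]
      .getProtectionDomain.getCodeSource
    println(if (source != null) source.getLocation else "bootstrap classpath")
  }
}

If the printed jar is not the commons-csv version that actually contains CSVParser.parse, some other dependency is shadowing it.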

2 Answers:

Answer 0 (score: 0)

This looks like a problematic classpath.

I would avoid using assemblyMergeStrategy to fix the classpath like this. It works fine for conflicting configuration files, such as log4j, but it is really not the right tool for this kind of mess.

Suggested solution: add exclude("org.apache.commons", "commons-csv") to every dependency that pulls in commons-csv, and keep only the one you actually need (in this case, the one that comes with Spark). A sketch of that approach is shown below.
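Here is a minimal sketch of that approach. Which artifacts actually drag in org.apache.commons:commons-csv in this build is an assumption on my part; check the dependency report (see the plugin sketch below) and apply the exclude to the real offenders:

// Sketch: keep a single commons-csv provider and exclude it everywhere else.
// The excluded artifacts are illustrative, not a confirmed list.
libraryDependencies ++= Seq(
  "com.databricks" % "spark-csv_2.10" % "1.3.2",          // keeps its own commons-csv
  "org.elasticsearch" % "elasticsearch-spark_2.10" % "2.3.0"
    exclude("org.apache.commons", "commons-csv"),
  "org.apache.mahout" % "mahout-hdfs" % "0.12.1"
    exclude("org.apache.commons", "commons-csv")
)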

In general, I would try to fix the classpath with exclude rules rather than with assemblyMergeStrategy.
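To find out which dependencies bring commons-csv onto the classpath in the first place, one option (my suggestion, not something the question already uses) is the sbt-dependency-graph plugin; the version below is illustrative for sbt 0.13:

// project/plugins.sbt -- sketch
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

Running sbt dependencyTree then prints the transitive tree; every path that ends in org.apache.commons:commons-csv (or in a jar repackaging those classes) is a candidate for an exclude rule.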

Answer 1 (score: 0)

For posterity:

If you have a dependency on apache-solr, it may conflict with its solr-commons-csv.jar dependency, which embeds classes with the same names (org.apache.commons.csv.CSVParser).
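In that case, a hedged sketch of the fix would be to exclude the repackaged jar from the (assumed) Solr dependency; the Solr artifact and version here are illustrative only:

// Sketch: drop the bundled solr-commons-csv jar so its copy of
// org.apache.commons.csv.CSVParser does not shadow the real commons-csv.
"org.apache.solr" % "solr-core" % "3.6.2"
  exclude("org.apache.solr", "solr-commons-csv")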