Error at runtime, though sbt compiles fine

Posted: 2016-01-29 02:54:44

Tags: scala sbt spark-streaming sbt-assembly

I have some code that compiles fine (Scala + Spark 1.6). I then run it (with Spark 1.6), but it complains that a 1.6 method is not there. What gives??

simple.sbt:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"
resolvers += "Conjars" at "http://conjars.org/repo"
resolvers += "cljars" at "https://clojars.org/repo/"

mainClass in Compile := Some("Medtronic.Class")

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.6.0"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "1.7.2"
libraryDependencies += "org.elasticsearch" %% "elasticsearch-spark" % "2.1.1"
libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "1.8.0"

Assembly:

$ sbt assembly
[info] Loading project definition from /Users/mlieber/projects/spark/test/project
[info] Set current project to Simple Project (in build file:/Users/mlieber/projects/spark/test/)
[info] Updating {file:/Users/mlieber/projects/spark/test/}test...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[warn] Scala version was updated by one of library dependencies:
[warn]  * org.scala-lang:scala-library:(2.10.4, 2.10.0) -> 2.10.5
[warn] To force scalaVersion, add the following:
[warn]  ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn]  * org.apache.spark:spark-core_2.10:1.4.1 -> 1.6.0
[warn] Run 'evicted' to see detailed eviction warnings
..

[info] Run completed in 257 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
..
[info] Including from cache: spark-core_2.10-1.6.0.jar
..
[info] Including from cache: spark-streaming_2.10-1.6.0.jar
..
[info] Assembly up to date: /Users/mlieber/projects/spark/test/target/scala-2.10/stream_test_1.0.jar
[success] Total time: 98 s, completed Jan 28, 2016 4:05:22 PM

I run with:
./app/spark-1.6.0-bin-hadoop2.6/bin/spark-submit --jars /Users/mlieber/app/elasticsearch-1.7.2/lib/elasticsearch-1.7.2.jar  --master local[4] --class "MyClass"    ./target/scala-2.10/stream_test_1.0.jar 

Runtime error:

    Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.streaming.dstream.PairDStreamFunctions.mapWithState(Lorg/apache/spark/streaming/StateSpec;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/streaming/dstream/MapWithStateDStream;    
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
..
    16/01/28 18:35:23 INFO SparkContext: Invoking stop() from shutdown hook

1 Answer:

Answer 0 (score: 2):

Your project is affected by Dependency Hell. What happens is that SBT resolves transitive dependencies by default, and one of your dependencies (elasticsearch-spark) requires another version of spark-core. From your log:

[warn] Here are some of the libraries that were evicted:
[warn]  * org.apache.spark:spark-core_2.10:1.4.1 -> 1.6.0
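
The log itself points at the next diagnostic step. As a sketch (assuming sbt is on the PATH and the command is run from the project root, as in the log above):

```shell
# List every dependency SBT evicted and which version replaced it,
# to confirm which spark-core version actually won resolution
sbt evicted
```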

It looks like the version required by elasticsearch-spark is not binary compatible with the version your project uses, hence the error when the project runs.

There is no error at compile time because the code being compiled (that is, your code) is compatible with the version that was resolved.

Here are some options for how to solve it:

  1. You can try to upgrade elasticsearch-spark to version 2.1.2 and see whether it brings in a newer version of spark-core (one that may be compatible with your project). Version 2.2.0-rc1 depends on spark-core 1.6.0, and upgrading to this version would certainly solve the problem, but keep in mind that you would be using a release candidate.
  2. You can downgrade spark-core and spark-streaming to version 1.4.1 (the version used by elasticsearch-spark) and adjust your code where necessary.
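
The two options above translate into small edits to simple.sbt. This is a hypothetical sketch, not part of the original answer; it only restates the versions already mentioned (2.2.0-rc1 for option 1, 1.4.1 for option 2):

```scala
// Option 1 (sketch): upgrade the connector to a build that targets spark-core 1.6.0.
// 2.2.0-rc1 is a release candidate, as noted above.
libraryDependencies += "org.elasticsearch" %% "elasticsearch-spark" % "2.2.0-rc1"

// Option 2 (sketch, use INSTEAD of option 1): pin the Spark modules to the
// version elasticsearch-spark 2.1.1 expects, then adapt any 1.6-only code
// (e.g. mapWithState, which does not exist in 1.4.1).
// libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
// libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
```

Whichever option you pick, re-run sbt assembly afterwards so the fat jar is rebuilt against the new versions.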