NoSuchMethodError when using SparkContext

Asked: 2015-08-21 22:16:20

Tags: scala maven apache-spark

I get a NoSuchMethodError when my Spark application executes val sc = new SparkContext("spark://spark01:7077", "Request Executor"). I am compiling my Spark application against version 1.3.1 with Scala 2.10.4. The Spark cluster is also built with 1.3.1 and the same Scala version.

From looking at the Spark source code, getTimeAsSeconds in Utils.scala does not exist before Spark 1.4. Why is it trying to call a method that does not exist in my version?

Here are the dependencies from my pom.xml:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.3.1</version>
</dependency>

<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.10.4</version>
</dependency>
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-compiler</artifactId>
  <version>2.10.4</version>
</dependency>
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-reflect</artifactId>
  <version>2.10.4</version>
</dependency>

<dependency>
  <groupId>com.twitter</groupId>
  <artifactId>util-eval_2.10</artifactId>
  <version>6.26.0</version>
</dependency>

<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>19.0-rc1</version>
</dependency>

<dependency>
  <groupId>org.jvnet.jaxb2_commons</groupId>
  <artifactId>jaxb2-basics-runtime</artifactId>
  <version>0.7.0</version>
</dependency>

<!-- Jackson JSON Library -->
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-core</artifactId>
  <version>2.4.4</version>
</dependency>
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
  <version>2.4.4</version>
</dependency>
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-annotations</artifactId>
  <version>2.4.4</version>
</dependency>
<dependency>
  <groupId>com.fasterxml.jackson.module</groupId>
  <artifactId>jackson-module-jaxb-annotations</artifactId>
  <version>2.4.4</version>
</dependency>

<dependency>
  <groupId>com.fasterxml.jackson.dataformat</groupId>
  <artifactId>jackson-dataformat-xml</artifactId>
</dependency>
<dependency>
  <groupId>org.codehaus.woodstox</groupId>
  <artifactId>woodstox-core-asl</artifactId>
  <version>4.1.4</version>
</dependency>

<dependency>
  <groupId>uk.org.simonsite</groupId>
  <artifactId>log4j-rolling-appender</artifactId>
  <version>20131024-2017</version>
</dependency>

<dependency>
  <groupId>junit</groupId>
  <artifactId>junit</artifactId>
  <version>4.12</version>
</dependency>

<dependency>
  <groupId>org.eclipse.jetty</groupId>
  <artifactId>jetty-server</artifactId>
  <version>9.3.1.v20150714</version>
</dependency>

<dependency>
  <groupId>org.eclipse.jetty</groupId>
  <artifactId>jetty-servlet</artifactId>
  <version>9.3.1.v20150714</version>
</dependency>

<dependency>
  <groupId>org.eclipse.jetty</groupId>
  <artifactId>jetty-webapp</artifactId>
  <version>9.3.1.v20150714</version>
</dependency> 

<dependency>
  <groupId>org.antlr</groupId>
  <artifactId>antlr4-runtime</artifactId>
  <version>4.5</version>
</dependency>

<dependency>
  <groupId>org.apache.solr</groupId>
  <artifactId>solr-solrj</artifactId>
  <version>5.2.1</version>
</dependency>

<dependency>
  <groupId>org.apache.solr</groupId>
  <artifactId>solr-core</artifactId>
  <version>5.2.1</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-jdk14</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.eclipse.jetty</groupId>
      <artifactId>jetty-util</artifactId>
    </exclusion>
  </exclusions>
</dependency>

Is something in my dependencies causing me to compile against Spark 1.4?
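
One way to check is Maven's dependency tree, which shows the versions that actually get resolved, including transitive ones (the includes filter below is only there to narrow the output to Spark artifacts):

mvn dependency:tree -Dincludes=org.apache.spark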

Stack trace:

java.lang.NoSuchMethodError: org.apache.spark.network.util.JavaUtils.timeStringAsSec(Ljava/lang/String;)J
        at org.apache.spark.util.Utils$.timeStringAsSeconds(Utils.scala:1027)
        at org.apache.spark.SparkConf.getTimeAsSeconds(SparkConf.scala:194)
        at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:68)
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:166)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
        at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
        at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
        at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:155)
        at com.scala.analytics.RequestExecutor$.executeRequest(RequestExecutor.scala:23)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:735)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
        at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:816)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1113)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1047)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:119)
        at org.eclipse.jetty.server.Server.handle(Server.java:517)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:302)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:242)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:238)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
        at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:57)
        at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceAndRun(ExecuteProduceConsume.java:213)
        at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:147)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:654)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572)
        at java.lang.Thread.run(Thread.java:745)

1 Answer:

Answer 0 (score: 1)

It turns out this was a silly mistake. I run this application on a different machine, so I copy the target directory over every time I compile. However, I never cleaned the target directory on the remote machine, so an old jar built against Spark 1.4.0 was sitting there and being picked up. Every time my application ran, it looked for a Spark jar and used the 1.4.0 jar instead of the 1.3.1 jar that was also in the directory. The fix was simply to delete the old (1.4.0) jar.
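
As a sanity check for this kind of classpath conflict, a small sketch like the following (a hypothetical helper, not part of the application above) prints the jar a class was actually loaded from; running it would have pointed straight at the stale 1.4.0 jar:

// Prints the jar or directory a class was actually loaded from.
// getCodeSource can return null for classes from the bootstrap class
// loader, hence the Option wrapper.
object WhichJar {
  def main(args: Array[String]): Unit = {
    val source = Option(
      classOf[org.apache.spark.SparkContext].getProtectionDomain.getCodeSource)
    source.map(_.getLocation) match {
      case Some(location) => println("SparkContext loaded from: " + location)
      case None => println("SparkContext came from the bootstrap class loader")
    }
  }
}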