When I create a Spark context with Scala, the following trace shows up.
I have read everything about netty version conflicts, but I cannot solve the issue. My full set of dependencies is listed further below, under UPDATED. This is the trace:
[sparkDriver-akka.actor.default-dispatcher-3] ERROR akka.actor.ActorSystemImpl - Uncaught fatal error from thread [sparkDriver-akka.remote.default-remote-dispatcher-5] shutting down ActorSystem [sparkDriver]
java.lang.NoSuchMethodError: org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(Ljava/util/concurrent/Executor;I)V
at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:283)
at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:240)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
at scala.util.Try$.apply(Try.scala:161)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at scala.util.Success.flatMap(Try.scala:200)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:692)
at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:684)
at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
at akka.remote.EndpointManager.akka$remote$EndpointManager$$listens(Remoting.scala:684)
at akka.remote.EndpointManager$$anonfun$receive$2.applyOrElse(Remoting.scala:492)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at akka.remote.EndpointManager.aroundReceive(Remoting.scala:395)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Sorry, but I cannot be more verbose, because I know absolutely nothing about this topic.
If anyone knows what is going on…
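For what it's worth, a quick way to check which jar the failing class is actually loaded from (a small diagnostic sketch; it only assumes the org.jboss.netty classes are on the classpath):

import org.jboss.netty.channel.socket.nio.NioWorkerPool

// Prints the jar that provides NioWorkerPool; an old netty 3.x jar here
// (one without the (Executor, Int) constructor) would explain the NoSuchMethodError.
println(classOf[NioWorkerPool].getProtectionDomain.getCodeSource.getLocation)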
UPDATED
I am just initializing a Spark context with Cassandra support (a minimal sketch of that initialization follows the dependency list). These are my dependencies:
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.2.2"
    exclude("com.esotericsoftware.minlog", "minlog")
    exclude("org.eclipse.jetty.orbit", "javax.transaction")
    exclude("org.eclipse.jetty.orbit", "javax.mail.glassfish")
    exclude("commons-beanutils", "commons-beanutils-core")
    exclude("commons-digester", "commons-digester")
    exclude("org.slf4j", "jcl-over-slf4j"),
  "org.apache.spark" %% "spark-streaming" % "1.2.2",
  "org.apache.spark" %% "spark-streaming-flume" % "1.2.2" exclude("org.mortbay.jetty", "servlet-api"),
  "org.apache.spark" %% "spark-mllib" % "1.2.2",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.0" withSources() withJavadoc(),
  "org.scalatest" % "scalatest_2.10" % "2.2.1" % "test",
  "org.cassandraunit" % "cassandra-unit" % "2.1.3.1" % "test",
  "org.apache.cassandra" % "cassandra-all" % "2.1.3",
  "com.bitmonlab.nrich" % "spark-jobserver-api" % "0.5.0"
)
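The initialization itself is just the usual pattern, roughly like this (a minimal sketch with placeholder app name, master and Cassandra host, not my exact code):

import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch: the Cassandra connector only needs spark.cassandra.connection.host
// set on the SparkConf before the SparkContext is created.
val conf = new SparkConf()
  .setAppName("cassandra-test")                          // placeholder
  .setMaster("local[*]")                                 // placeholder
  .set("spark.cassandra.connection.host", "127.0.0.1")   // placeholder host
val sc = new SparkContext(conf)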
Answer 0 (score: 0)
It was a dependency problem: the Avro-Tools jar was being imported into the project, and that was causing the error. Thanks, everyone.
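In sbt terms the fix amounts to deleting (or commenting out) the direct avro-tools entry from the build; as a hypothetical example (these coordinates are illustrative, not the asker's actual line):

// Hypothetical entry to remove: the avro-tools jar ships with (or pulls in) old
// netty classes that shadow the newer netty Akka expects.
// "org.apache.avro" % "avro-tools" % "1.7.7",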
Answer 1 (score: 0)
I had a similar problem, but with Maven instead of sbt. Since I had avro-ipc as one of my dependencies, I needed to exclude netty from it, so it looks like this:
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro-ipc</artifactId>
    <version>${avro.version}</version>
    <exclusions>
        <exclusion>
            <groupId>io.netty</groupId>
            <artifactId>netty</artifactId>
        </exclusion>
    </exclusions>
</dependency>
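Since the question itself uses sbt rather than Maven, the rough sbt equivalent of this exclusion would be something like the line below (a sketch; the version string is a placeholder standing in for ${avro.version}):

// Sketch: keep avro-ipc but drop the netty it pulls in, so the newer netty
// required by Akka wins on the classpath.
"org.apache.avro" % "avro-ipc" % "1.7.7" exclude("io.netty", "netty")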