flink: Scala version conflict?

Date: 2015-10-25 20:01:34

Tags: java scala intellij-idea apache-flink

I'm trying to compile the Kafka example from here in IntelliJ. After working through a lot of dependency problems, I can't seem to get past this one.

I've come across a few hints suggesting this is a Scala version issue. Here is the error:

15/10/25 12:36:34 ERROR actor.ActorSystemImpl: Uncaught fatal error from thread [flink-akka.actor.default-dispatcher-4] shutting down ActorSystem [flink]
java.lang.NoClassDefFoundError: scala/runtime/AbstractPartialFunction$mcVL$sp
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.flink.runtime.jobmanager.MemoryArchivist.handleMessage(MemoryArchivist.scala:80)
at org.apache.flink.runtime.FlinkActor$class.receive(FlinkActor.scala:32)
at org.apache.flink.runtime.jobmanager.MemoryArchivist.org$apache$flink$runtime$LogMessages$$super$receive(MemoryArchivist.scala:59)
at org.apache.flink.runtime.LogMessages$class.receive(LogMessages.scala:26)
at org.apache.flink.runtime.jobmanager.MemoryArchivist.receive(MemoryArchivist.scala:59)
at akka.actor.ActorCell.newActor(ActorCell.scala:567)
at akka.actor.ActorCell.create(ActorCell.scala:587)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:460)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:482)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:282)
at akka.dispatch.Mailbox.run(Mailbox.scala:223)
at akka.dispatch.Mailbox.exec(Mailbox.scala:234)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.ClassNotFoundException: scala.runtime.AbstractPartialFunction$mcVL$sp
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 28 more

Any suggestions as to where I've gone wrong?

1 Answer:

Answer 0: (score: 5)

The problem is indeed a Scala version mismatch. You are mixing dependencies built for Scala 2.11, e.g. org.apache.kafka:kafka_2.11:0.8.2.2, with Flink dependencies, which by default are built for Scala 2.10. The stack trace above is the symptom: scala.runtime.AbstractPartialFunction$mcVL$sp is a specialized runtime class that exists in Scala 2.10 but was removed in 2.11, so Flink classes compiled against 2.10 cannot be loaded when a 2.11 scala-library is on the classpath.
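One quick way to confirm the mismatch, assuming a Maven build like the linked example, is to filter the dependency tree for Scala artifacts:

mvn dependency:tree -Dincludes=org.scala-lang

If both a 2.10 and a 2.11 scala-library show up, or any transitive dependency carries a _2.11 suffix, that is the conflict described below.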

One of the dependencies built for Scala 2.11 pulls in the scala-library:2.11 jar, which supersedes the scala-library:2.10 dependency required by the Flink dependencies. You can either use binaries built for Scala 2.10 for the non-Flink dependencies, or build and install Flink with Scala 2.11. See https://ci.apache.org/projects/flink/flink-docs-master/setup/building.html#build-flink-for-a-specific-scala-version for how to build Flink against a different Scala version.
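As a sketch of the first option, this is what the Kafka dependency in pom.xml could look like with its Scala suffix aligned to the 2.10 default of the Flink binaries (the coordinates are inferred from the kafka_2.11:0.8.2.2 mentioned above; adjust to your actual dependency list):

<!-- Kafka client built for Scala 2.10 to match the default Flink binaries -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.8.2.2</version>
</dependency>

Any other _2.11 artifact in the pom would need the same treatment; afterwards only scala-library:2.10 should be resolved, and the NoClassDefFoundError above should disappear.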

Kafka example

If you simply want to bump the referenced Kafka example up to 0.10-SNAPSHOT, you have to change the Flink version in the pom.xml file and use FlinkKafkaProducer instead of KafkaSink in the WriteIntoKafka.java file. You no longer need the SimpleStringSchema. That is all you have to change (no additional dependencies are needed); a rough sketch of the swap follows.
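For orientation only, here is a minimal sketch of that sink swap. The broker address, topic name, and the three-argument FlinkKafkaProducer constructor mirroring the old KafkaSink call are assumptions, not confirmed 0.10-SNAPSHOT API; in particular, per the note above the serialization schema argument may no longer be required, so check the constructor against the 0.10 javadocs:

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class WriteIntoKafkaSketch {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Placeholder source; the real example generates its messages itself
        DataStream<String> messageStream = env.fromElements("one", "two", "three");

        // Flink 0.9.x form from the example:
        // messageStream.addSink(
        //     new KafkaSink<String>("localhost:9092", "demo-topic", new SimpleStringSchema()));

        // Flink 0.10-SNAPSHOT form: FlinkKafkaProducer replaces KafkaSink.
        // The argument list here is an assumption mirroring the old call.
        messageStream.addSink(
                new FlinkKafkaProducer<String>("localhost:9092", "demo-topic", new SimpleStringSchema()));

        env.execute("write-into-kafka-sketch");
    }
}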