I am trying to run the basic WordCount example in Flink, found at the following link - Link
When I run it from Eclipse, it fails with the following exception -
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.flink.core.memory.MemorySegmentFactory.initializeFactory(Lorg/apache/flink/core/memory/MemorySegmentFactory$Factory;)V
at org.apache.flink.runtime.taskmanager.TaskManager$.parseTaskManagerConfiguration(TaskManager.scala:1936)
at org.apache.flink.runtime.taskmanager.TaskManager$.startTaskManagerComponentsAndActor(TaskManager.scala:1684)
at org.apache.flink.runtime.minicluster.LocalFlinkMiniCluster.startTaskManager(LocalFlinkMiniCluster.scala:118)
at org.apache.flink.runtime.minicluster.FlinkMiniCluster$$anonfun$2.apply(FlinkMiniCluster.scala:270)
at org.apache.flink.runtime.minicluster.FlinkMiniCluster$$anonfun$2.apply(FlinkMiniCluster.scala:263)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.Range.foreach(Range.scala:141)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:263)
at org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:236)
at org.apache.flink.client.LocalExecutor.start(LocalExecutor.java:115)
at org.apache.flink.client.LocalExecutor.executePlan(LocalExecutor.java:173)
at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:91)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:926)
at org.apache.flink.api.java.DataSet.collect(DataSet.java:410)
at org.apache.flink.api.java.DataSet.print(DataSet.java:1605)
at org.hemant.notifier.stream.WordCount.main(WordCount.java:74)
Any idea how to fix this?
Thanks
Answer 0 (score: 0)
The MemorySegmentFactory class is part of flink-core. Your problem is most likely that you are missing some Flink dependencies.
Maven / Java
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-java</artifactId>
  <version>1.1.4</version>
</dependency>
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-clients_2.11</artifactId>
  <version>1.1.4</version>
</dependency>
(If you are using Scala 2.10, set the version to your Flink version and change _2.11 to _2.10, as in the sketch below.)
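For reference, the Scala 2.10 variant of the clients dependency might look like this (a sketch; it assumes a flink-clients_2.10 artifact is published for your Flink version, here reusing 1.1.4 from above):

<!-- Scala 2.10 build of the Flink client libraries (adjust the version to yours) -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-clients_2.10</artifactId>
  <version>1.1.4</version>
</dependency>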
SBT / Scala
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.apache.flink" %% "flink-scala" % "1.2.0",
"org.apache.flink" %% "flink-clients" % "1.2.0"
)
(Again, update the versions to match your setup.)
Answer 1 (score: 0)
For me it was the order of the dependencies: flink-java should come after flink-clients or flink-test-utils in the dependency list, for example as sketched below.
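A minimal sketch of what that ordering might look like in a pom.xml, reusing the 1.1.4 / Scala 2.11 coordinates from the first answer (adjust both to your own versions):

<dependencies>
  <!-- flink-clients listed first -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.11</artifactId>
    <version>1.1.4</version>
  </dependency>
  <!-- flink-java listed after it, per the ordering suggested above -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>1.1.4</version>
  </dependency>
</dependencies>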