FlinkKafkaConsumer011 not found on the Flink cluster

Time: 2018-07-23 13:11:16

Tags: scala apache-flink flink-streaming

I am trying to run a Flink job on a cluster. The job runs fine in my development (local) environment, but when I deploy it on the cluster with the following command:

./bin/flink run -c org.example.CointegrationOfPairs ../coint.jar

it fails with this error:

java.lang.NoClassDefFoundError: org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumer011
    at org.example.CointegrationOfPairs$.main(CointegrationOfPairs.scala:38)
    at org.example.CointegrationOfPairs.main(CointegrationOfPairs.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:420)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:404)
    at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:785)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:279)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:214)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1025)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1101)
    at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1101)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

I have also added the required dependencies:

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-connector-kafka-0.11" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-ml" % flinkVersion % "provided"
)

I am building the jar file with sbt clean assembly.
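
For reference, sbt clean assembly relies on the sbt-assembly plugin; a minimal project/plugins.sbt sketch (the plugin version shown here is only illustrative):

// project/plugins.sbt — enables the `assembly` task used above
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")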

1 Answer:

Answer (score: 3):

The connectors are not included in Flink's binary distribution in order to avoid version conflicts between their dependencies and user code. Hence, by default the corresponding classes are not on the classpath of the Flink processes.

There are two ways to fix this:

  1. Do not declare the flink-connector-kafka dependency as provided. Instead, build a fat JAR that includes the connector dependency, so the connector is shipped together with your application. This is the preferred approach (see the build.sbt sketch after this list).

  2. Add the JAR file of the flink-connector-kafka dependency to the ./lib folder of your Flink setup. It will then be distributed with the cluster and included in the classpath of the Flink processes.
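
For the first option, a minimal sketch of the dependency list, assuming the same flinkVersion value as in the question: only the Kafka connector is taken out of the provided scope so that sbt-assembly bundles it into the fat JAR, while the core Flink modules stay provided because the cluster already supplies them.

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  // not "provided": bundle the Kafka connector into the fat JAR
  "org.apache.flink" %% "flink-connector-kafka-0.11" % flinkVersion,
  "org.apache.flink" %% "flink-ml" % flinkVersion % "provided"
)

After rebuilding with sbt clean assembly, the connector classes should end up inside coint.jar and the NoClassDefFoundError should disappear.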