Flink Kafka Connector runtime error

Date: 2016-10-12 22:22:23

Tags: apache-kafka apache-flink

I am using:

  • flink 1.1.2
  • Kafka 0.10.0.1 (Scala 2.10 build)
  • flink-connector-kafka-0.9_2.10, version 1.0.0

I am running the following very simple/basic application, which computes this query:

select f1.user1_id as user1, f2.user1_id as user2, count(f1.user2_id) as 
mutual_count from Friends f1 JOIN Friends f2 ON 
f1.user2_id = f2.user2_id AND f1.user1_id <> f2.user1_id  GROUP BY
f1.user1_id, f2.user1_id order by mutual_count desc

After compiling with Maven, I try to run the job. The relevant setup code:

Properties properties = new Properties();
properties.setProperty("bootstrap.servers", "localhost:33334");
properties.setProperty("partition.assignment.strategy", "org.apache.kafka.clients.consumer.RangeAssignor");
properties.setProperty("group.id", "test");
String topic = "mytopic";

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

FlinkKafkaConsumer09<String> fkc =
    new FlinkKafkaConsumer09<String>(topic, new SimpleStringSchema(), properties);

DataStream<String> stream = env.addSource(fkc);
env.execute();
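
For reference, a dependency layout like the sketch below (reconstructed from the versions listed above, not the asker's actual pom.xml) would produce exactly this situation: `FlinkKafkaConsumer09` is compiled against the Kafka 0.9 client API, where `KafkaConsumer.assign` takes a `List`, while the 0.10 client on the runtime classpath only offers `assign(Collection)`:

```xml
<!-- Hypothetical pom.xml fragment, reconstructed from the versions in the question -->
<dependencies>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.9_2.10</artifactId>
    <version>1.0.0</version> <!-- does not match the Flink 1.1.2 runtime -->
  </dependency>
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.0.1</version> <!-- 0.10 client: assign(Collection), no assign(List) -->
  </dependency>
</dependencies>
```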

When I run it with the following command, I see the runtime error below:

bin/flink run -c  com.mycompany.app.App fkaf/target/fkaf-1.0-SNAPSHOT.jar

Why is the assign() method not found? The method is right there in lib/kafka-clients-0.10.0.1.jar.

Submitting job with JobID: f6e290ec7c28f66d527eaa5286c00f4d. Waiting for job completion.
Connected to JobManager at Actor[akka.tcp://flink@127.0.0.1:6123/user/jobmanager#-1679485245]
10/12/2016 15:10:06     Job execution switched to status RUNNING.
10/12/2016 15:10:06     Source: Custom Source(1/1) switched to SCHEDULED 
10/12/2016 15:10:06     Source: Custom Source(1/1) switched to DEPLOYING 
10/12/2016 15:10:06     Map -> Sink: Unnamed(1/1) switched to SCHEDULED 
10/12/2016 15:10:06     Map -> Sink: Unnamed(1/1) switched to DEPLOYING 
10/12/2016 15:10:06     Source: Custom Source(1/1) switched to RUNNING 
10/12/2016 15:10:06     Map -> Sink: Unnamed(1/1) switched to RUNNING 
10/12/2016 15:10:06     Map -> Sink: Unnamed(1/1) switched to CANCELED 
10/12/2016 15:10:06     Source: Custom Source(1/1) switched to FAILED 
java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign(Ljava/util/List;)V
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.open(FlinkKafkaConsumer09.java:282)
    at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:38)
    at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:91)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:376)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:256)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:584)
    at java.lang.Thread.run(Thread.java:722)

1 Answer:

Answer 0 (score: 1)

A NoSuchMethodError indicates a version mismatch.

My guess is that the problem is that you are trying to connect a Kafka 0.9 consumer to a Kafka 0.10 instance. Flink 1.1.x does not provide a Kafka 0.10 consumer; however, a 0.10 consumer will be included in the upcoming 1.2.0 release.

You can try building the Kafka 0.10 consumer yourself from the current master branch (1.2-SNAPSHOT) and use it with Flink 1.1.2. The relevant Flink APIs should be stable and backward compatible from 1.2 to 1.1.
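
Concretely, one way to align the versions (a sketch, assuming you stay on the Kafka 0.9 client until the 0.10 connector is released) is to match the connector version to the Flink runtime and let Maven pull in the matching kafka-clients jar transitively:

```xml
<!-- Sketch: connector version matched to the Flink 1.1.2 runtime -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka-0.9_2.10</artifactId>
  <version>1.1.2</version>
</dependency>
```

Also make sure no kafka-clients-0.10.x jar (such as the lib/kafka-clients-0.10.0.1.jar mentioned in the question) sits on the Flink classpath and shadows the 0.9 client that the connector expects.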