How to fix Flink error: cannot infer type arguments for FlinkKafkaConsumer011<>

Date: 2019-05-23 16:36:51

Tags: java apache-flink

I am following an example of using Flink with Kafka. The only results I could find, such as this page, do not compile correctly and produce error messages that are hard to search for.

Basically, I get an error when I try to compile this code snippet:

import org.apache.flink.streaming.util.serialization.DeserializationSchema;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

import java.util.Properties;

public final class Main {

    public static FlinkKafkaConsumer011<String> createStringConsumerForTopic(
            String topic, String kafkaAddress, String kafkaGroup ) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", kafkaAddress);
        props.setProperty("group.id",kafkaGroup);
        FlinkKafkaConsumer011<String> consumer =
                new FlinkKafkaConsumer011<>(topic, new SimpleStringSchema(),props);

        return consumer;
    }
}

Here are my dependencies, as declared in my build.gradle file:

group 'myapp'
version '1.0-SNAPSHOT'

apply plugin: 'java'
sourceCompatibility = 1.8

repositories {
  jcenter()
}

dependencies {
  ecj 'org.eclipse.jdt.core.compiler:ecj:4.6.1'
  compile group: 'org.apache.flink', name: 'flink-streaming-java_2.11', version: '1.2.0'
  compile group: 'org.apache.flink', name: 'flink-java', version: '1.5.0'
  compile group: 'org.apache.flink', name: 'flink-clients_2.11', version: '1.5.0'
  compile group: 'org.apache.flink', name: 'flink-avro', version: '1.8.0'
  compile group: 'org.apache.flink', name: 'flink-core', version: '1.5.0'
  compile group: 'org.apache.flink', name: 'flink-connector-kafka-0.11_2.11', version: '1.5.0'

  compile group: 'org.apache.kafka', name: 'kafka_2.11', version: '1.1.0'
  compile group: 'org.apache.kafka', name: 'kafka-clients', version: '1.1.0'

  compile group: 'com.google.code.gson', name: 'gson', version: '2.8.5'
}

Here is the error I get when running the build tool:

$ gradle build
> Task :compileJava FAILED
/Users/john/dev/john/flink-example/src/main/java/com/company/opi/flinkexample/Main.java:55: error: cannot infer type arguments for FlinkKafkaConsumer011<>
                new FlinkKafkaConsumer011<>(topic, new SimpleStringSchema(),props);
                ^
Note: /Users/john/dev/john/flink-example/src/main/java/com/company/opi/flinkexample/EnvironmentConfig.java uses unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
1 error


FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':compileJava'.
> Compilation failed; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 3s
1 actionable task: 1 executed

Here is a link to the source code.

1 Answer:

Answer 0 (score: 0)

One problem: all of the Flink libraries you use should have the same version number, but you appear to be mixing versions 1.2.0, 1.5.0, and 1.8.0. Below are updated dependencies and source code that compile correctly.

(build.gradle)

group 'myapp'
version '1.0-SNAPSHOT'

apply plugin: 'java'
sourceCompatibility = 1.8

repositories {
  jcenter()
}

dependencies {
  compile group: 'org.apache.flink', name: 'flink-streaming-java_2.11', version: '1.8.0'
  compile group: 'org.apache.flink', name: 'flink-java', version: '1.8.0'
  compile group: 'org.apache.flink', name: 'flink-clients_2.11', version: '1.8.0'
  compile group: 'org.apache.flink', name: 'flink-avro', version: '1.8.0'
  compile group: 'org.apache.flink', name: 'flink-core', version: '1.8.0'
  compile group: 'org.apache.flink', name: 'flink-connector-kafka_2.12', version: '1.8.0'

  compile group: 'com.google.code.gson', name: 'gson', version: '2.8.5'
}
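As a guard against this kind of version drift in the future, one option (a sketch of a common Gradle idiom, not part of the original answer) is to factor the Flink version into a single `ext` property so every artifact is pinned to the same release:

```groovy
// Sketch: pin all Flink artifacts to one version via an ext property.
// Artifact names are taken from the answer's build.gradle above.
ext {
  flinkVersion = '1.8.0'
}

dependencies {
  compile group: 'org.apache.flink', name: 'flink-streaming-java_2.11', version: flinkVersion
  compile group: 'org.apache.flink', name: 'flink-java', version: flinkVersion
  compile group: 'org.apache.flink', name: 'flink-clients_2.11', version: flinkVersion
  compile group: 'org.apache.flink', name: 'flink-avro', version: flinkVersion
  compile group: 'org.apache.flink', name: 'flink-core', version: flinkVersion
  compile group: 'org.apache.flink', name: 'flink-connector-kafka_2.12', version: flinkVersion
}
```

With this layout, upgrading Flink means changing one line, so the mixed-version situation that caused the original compile error cannot recur silently.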

(workingCode.java)

import org.apache.flink.streaming.util.serialization.SimpleStringSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import java.util.Properties;

public final class Main {

    public static FlinkKafkaConsumer<String> createStringConsumerForTopic(
            String topic, String kafkaAddress, String kafkaGroup ) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", kafkaAddress);
        props.setProperty("group.id",kafkaGroup);
        FlinkKafkaConsumer<String> consumer =
                new FlinkKafkaConsumer<>(topic, new SimpleStringSchema(),props);

        return consumer;
    }
}

Also, unrelated to the compile error: since you are using Kafka 1.1, you are better off using the latest version of Flink's Kafka connector rather than the connector for Kafka 0.11. FlinkKafkaConsumer (the class without a version number in its name) is the connector for Kafka 1.0.0 and newer.
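For context, a consumer built by a factory method like the one above is typically handed to a StreamExecutionEnvironment as a source. The following is a minimal end-to-end sketch against the Flink 1.8 DataStream API; the topic, broker address, and group id are placeholders, not values from the original question:

```java
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

import java.util.Properties;

public final class KafkaJobSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder address
        props.setProperty("group.id", "example-group");           // placeholder group

        // FlinkKafkaConsumer (no version suffix) targets Kafka 1.0.0 and newer
        FlinkKafkaConsumer<String> consumer =
                new FlinkKafkaConsumer<>("example-topic", new SimpleStringSchema(), props);

        // Wire the consumer into the job graph and print each record
        DataStream<String> stream = env.addSource(consumer);
        stream.print();

        env.execute("Kafka consumer example");
    }
}
```

Note that this sketch imports SimpleStringSchema from org.apache.flink.api.common.serialization, which is the non-deprecated location in Flink 1.8; the org.apache.flink.streaming.util.serialization package used in the question still compiles but is deprecated.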