Kafka - Apache Flink execution log4j error

Date: 2017-07-27 14:04:42

Tags: apache-kafka apache-flink flink-streaming

I'm trying to run a simple Apache Flink job with Kafka integration, but I keep running into execution problems. The job should read messages from a Kafka producer, process them, and then send the results back to another topic. I took the example from here: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Simple-Flink-Kafka-Test-td4828.html

The error I get is:

Exception in thread "main" java.lang.NoSuchFieldError: ALL
    at org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator.createJobGraph(StreamingJobGraphGenerator.java:86)
    at org.apache.flink.streaming.api.graph.StreamGraph.getJobGraph(StreamGraph.java:429)
    at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:46)
    at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:33)

Here is my code:

import java.util.Properties;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class App {
      public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            Properties properties = new Properties();
            properties.setProperty("bootstrap.servers", "localhost:9092");

            //properties.setProperty("zookeeper.connect", "localhost:2181");
            properties.setProperty("group.id", "javaflink");

            DataStream<String> messageStream = env.addSource(new FlinkKafkaConsumer010<String>("test", new SimpleStringSchema(), properties));
            System.out.println("Step D");
            messageStream.map(new MapFunction<String, String>() {

                    @Override
                    public String map(String value) throws Exception {
                            return "Blablabla " + value;
                    }
            }).addSink(new FlinkKafkaProducer010<String>("localhost:9092", "demo2", new SimpleStringSchema()));
            env.execute();
      }
}

These are the pom.xml dependencies:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-core</artifactId>
    <version>1.3.1</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java_2.11</artifactId>
    <version>0.10.2</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.11</artifactId>
    <version>1.3.1</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-core</artifactId>
    <version>0.9.1</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.11</artifactId>
    <version>1.3.1</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.10_2.11</artifactId>
    <version>1.3.1</version>
</dependency>

What could be causing this kind of error?

Thanks, Luca

1 answer:

Answer 0 (score: 0)

The problem is most likely caused by mixing different Flink versions in the pom.xml: a java.lang.NoSuchFieldError at runtime typically means a class was compiled against one version of a dependency but is loaded alongside an incompatible version on the classpath. To run this program, it should suffice to include the following dependencies:

<!-- Streaming API -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.11</artifactId>
    <version>1.3.1</version>
</dependency>

<!-- In order to execute the program from within your IDE -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.11</artifactId>
    <version>1.3.1</version>
</dependency>

<!-- Kafka connector dependency -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.10_2.11</artifactId>
    <version>1.3.1</version>
</dependency>
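
Since the fix boils down to keeping every Flink artifact on the same release, one further safeguard (my suggestion, not part of the original answer) is to declare the version once as a Maven property, so the artifacts cannot drift apart again; the property name flink.version is just a convention:

```xml
<properties>
    <!-- Single source of truth for the Flink release -->
    <flink.version>1.3.1</flink.version>
</properties>

<!-- ... then reference it from every Flink dependency: -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.11</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.11</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.10_2.11</artifactId>
    <version>${flink.version}</version>
</dependency>
```

Afterwards, running mvn dependency:tree -Dincludes=org.apache.flink shows which Flink versions actually end up on the classpath, which makes accidental mixes like flink-streaming-core 0.9.1 next to flink-clients 1.3.1 easy to spot.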