Duplicate message processing with Spring Cloud Stream and Kafka

Time: 2017-03-01 11:11:04

Tags: duplicates apache-kafka spring-cloud-stream

I am using Spring Cloud Stream with the Kafka binder. It works fine, but the client receives duplicate messages. I have already tried all of the Kafka consumer properties, with no result.

There are two classes to look at in my example application - AggregateApplication and EventFilterApplication. If I run EventFilterApplication, only 1 message arrives; with AggregateApplication, 2 identical messages arrive.

Here is my code:

1) Aggregator

import com.example.EventFilterApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.aggregate.AggregateApplicationBuilder;

@SpringBootApplication
public class AggregateApplication {
    public static void main(String[] args) {
        new AggregateApplicationBuilder(new Object[]{EventFilterApplication.class}, args)
            .from(EventFilterApplication.class)
            .run(args);
    }
}

2) EventFilterApplication

import java.util.Arrays;
import java.util.Date;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Source;
import org.springframework.messaging.SubscribableChannel;

@SpringBootApplication
@EnableBinding(EventFilterApplication.LiveProcessor.class)
public class EventFilterApplication {

    @Autowired
    LiveProcessor source;

    @StreamListener(LiveProcessor.INPUT)
    public void handle(byte[] event) {
        try {
            System.out.println(new Date().getTime() + ": event was processed:" + Arrays.toString(event));
        } catch (Exception e) {
            System.out.println(String.format("Error={%s} on processing message=%s", e.getMessage(), Arrays.toString(event)));
        }
    }

    public static void main(String[] args) {
        SpringApplication.run(EventFilterApplication.class, args);
    }

    interface LiveProcessor extends Source {

        String INPUT = "liveSource";

        @Input(INPUT)
        SubscribableChannel input();
    }
}

3) application.yml

spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: kafka-broker.example.com:9092
          defaultBrokerPort: 9092
          defaultZkPort: 2181
          zkNodes: kafka-zookeeper.example.com
      type: kafka
      bindings:
        liveSource:
          binder: kafka
          consumer:
            headerMode: raw
            autoCommitOffset: true
          destination: topic_example_name
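
One thing worth noting about this binding (an aside on my part, not something stated in the question or the answer below): when no consumer group is set, each instance gets an anonymous group and receives every message, which is another common source of duplicates when several instances run. A minimal sketch of the same binding with a consumer group - the group name example-group is made up:

spring:
  cloud:
    stream:
      bindings:
        liveSource:
          destination: topic_example_name
          # hypothetical group name; instances in the same group share the topic's partitions
          group: example-group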

4) build.gradle

buildscript {
    ext { springBootVersion = '1.4.2.RELEASE' }
    repositories {
        jcenter()
        maven { url 'http://repo.spring.io/plugins-release' }
    }
    dependencies {
        classpath("org.springframework.build.gradle:propdeps-plugin:0.0.7")
        classpath("org.springframework.boot:spring-boot-gradle-plugin:$springBootVersion")
        classpath("io.spring.gradle:dependency-management-plugin:0.5.2.RELEASE")
    }
}

ext['logstashLogbackEncoderV'] = '4.8'
ext['springCloudV'] = 'Camden.SR1'
ext['springCloudStreamV'] = 'Brooklyn.SR2'
ext['springIntegrationKafkaV'] = '1.3.1.RELEASE'

subprojects {
    apply plugin: 'java'
    apply plugin: 'propdeps'
    apply plugin: 'propdeps-idea'
    apply plugin: "io.spring.dependency-management"

    sourceCompatibility = 1.8

    dependencyManagement {
        imports {
            mavenBom "org.springframework.cloud:spring-cloud-dependencies:Camden.SR1"
            mavenBom "org.springframework.cloud:spring-cloud-stream-dependencies:Brooklyn.SR2"
            mavenBom "org.springframework.cloud.stream.app:spring-cloud-stream-app-dependencies:1.0.4.RELEASE"
        }
    }

    dependencies {
        compile("org.springframework.boot:spring-boot-starter-web:$springBootVersion") {
            exclude module: "spring-boot-starter-tomcat"
            exclude group: 'log4j'
        }

        compile("org.springframework.cloud:spring-cloud-starter-stream-kafka")

        compile("org.springframework.integration:spring-integration-kafka:$springIntegrationKafkaV") {
            exclude group: "org.slf4j"
        }

        compile("org.springframework.cloud:spring-cloud-stream:")

        compile("org.springframework.cloud:spring-cloud-starter-sleuth")

        compile("net.logstash.logback:logstash-logback-encoder:${logstashLogbackEncoderV}")

        testCompile("org.springframework.boot:spring-boot-starter-test:$springBootVersion") {
            exclude group: "org.slf4j"
        }
    }
}

1 Answer:

Answer 0 (score: 1)

The duplicates are caused by adding EventFilterApplication as the parent root:

public class AggregateApplication {
    public static void main(String[] args) {
        new AggregateApplicationBuilder(new Object[]{EventFilterApplication.class}, args)
            .from(EventFilterApplication.class)
            .run(args);
    }
}

This most likely creates two subscriptions. Instead of adding EventFilterApplication as the root, you can simply do the following:

public class AggregateApplication {
    public static void main(String[] args) {
        new AggregateApplicationBuilder(args)
            .from(EventFilterApplication.class)
            // rest of the pipeline
            .run(args);
    }
}
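
For illustration only, if the aggregate is actually meant to chain further stages, the pipeline would typically continue with via(...). The SomeOtherProcessor class below is hypothetical and merely stands in for the omitted "rest of the pipeline":

public class AggregateApplication {
    public static void main(String[] args) {
        new AggregateApplicationBuilder(args)
            .from(EventFilterApplication.class)
            // SomeOtherProcessor is a hypothetical stage annotated with @EnableBinding(Processor.class)
            .via(SomeOtherProcessor.class)
            .run(args);
    }
}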

If you don't need to create an aggregate, this should be enough:

public static void main(String[] args) {
    SpringApplication.run(EventFilterApplication.class, args);
}

Edit: added an extra example and clarified the answer.