Spring Cloud Stream Kafka binder KTable not working

Date: 2018-07-09 08:15:19

Tags: apache-kafka apache-kafka-streams spring-cloud-stream

I am trying to build a KTable and then consume it through a Spring Cloud Stream (SCSt) channel, but it does not work: the input KTable receives no data. If I instead look at the KStream aggregation (via toStream()), I can see some data. As far as I can tell, the KTable is not queryable and it does not have a queryable store name.

Class:

@Slf4j
@EnableBinding({LimitBinding.class})
public class CommonWorker {

  @Value("${app.dataflow.out-destination}")
  private String customerOut;

  private LimitCustomersHelper custHelper = new LimitCustomersHelper();

  @StreamListener(CUSTOMER_IN)
  public void groupCustomersByLimitIdKTable(KStream<Key, Envelope> input) {
   input
        .filter(custHelper::afterIsNotNull)
        .groupBy(custHelper::groupBy)
        .aggregate(
            custHelper::create,
            custHelper::aggregate,
            custHelper.materialized(customerOut)
        );
  }

  @StreamListener
  public void checkCustomerasTable(@Input(CUSTOMER_OUT) KTable<StringWrapper,LimitCustomers> customers){
    customers.toStream().peek(StreamUtils::peek);
  }
}
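
The helper's materialized(...) factory is not shown in the question. A minimal sketch of what it might look like, assuming Avro serdes configured against the schema registry from the application.yml below and the question's StringWrapper/LimitCustomers Avro types (the registry URL and field choices here are assumptions, not taken from the post):

import java.util.Collections;
import java.util.Map;

import io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig;
import io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.KeyValueStore;

public class LimitCustomersHelper {

  // Builds a Materialized view named after the given store name,
  // using Avro serdes so the state store matches the record types.
  public Materialized<StringWrapper, LimitCustomers, KeyValueStore<Bytes, byte[]>> materialized(String storeName) {
    Map<String, String> serdeConfig = Collections.singletonMap(
        AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://192.168.99.100:8081");

    SpecificAvroSerde<StringWrapper> keySerde = new SpecificAvroSerde<>();
    keySerde.configure(serdeConfig, true);    // isKey = true

    SpecificAvroSerde<LimitCustomers> valueSerde = new SpecificAvroSerde<>();
    valueSerde.configure(serdeConfig, false); // isKey = false

    return Materialized.<StringWrapper, LimitCustomers, KeyValueStore<Bytes, byte[]>>as(storeName)
        .withKeySerde(keySerde)
        .withValueSerde(valueSerde);
  }
}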

Binding:

public interface LimitBinding {

  String CUSTOMER_IN = "customer-in";
  String CUSTOMER_OUT = "customer-out";


  @Input(CUSTOMER_IN)
  KStream<Key, Envelope> customerInput();

  @Input(CUSTOMER_OUT)
  KTable<StringWrapper, LimitCustomers> customersStream();

}

application.yml:

server.port: 0
spring:
  application.name: connect-producer
  cloud.stream:
    kafka.streams.binder.configuration:
      schema:
        registry.url: http://192.168.99.100:8081
      default:
        key.serde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
        value.serde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
    schema.avro.dynamic-schema-generation-enabled: true
    bindings:
      customer-in:
        contentType: application/*+avro
        destination: ${app.dataflow.in-destination}
        group: ${app.dataflow.in-destination}
      customer-out:
        consumer.materializedAs: ${app.dataflow.out-destination}

app.dataflow:
  in-destination: customer_link
  out-destination: customer_link.next


spring.cloud.stream.kafka.streams.binder:
  brokers: 192.168.99.100:9092
  configuration.application.server: 192.168.99.100:9092

1 answer:

Answer 0 (score: 0):

The problem was solved by adding a topic whose name mirrors the table name.
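
The answer does not show the exact change. Presumably it means giving the customer-out binding an explicit destination topic whose name mirrors the materialized table/store name, roughly along these lines (a sketch of one possible reading of the answer, not confirmed by it):

spring.cloud.stream.bindings:
  customer-out:
    contentType: application/*+avro
    # hypothetical fix: bind the KTable input to a real topic whose
    # name mirrors the materialized table/store name
    destination: ${app.dataflow.out-destination}

For the KTable binding to receive anything, the aggregated records presumably also have to be published to that topic (for example with toStream().to(...) in the first processor), since a KTable input binding is populated from its destination topic rather than from another processor's local state store. Note also that materializedAs is normally a Kafka Streams consumer binding property, i.e. spring.cloud.stream.kafka.streams.bindings.customer-out.consumer.materializedAs, rather than a plain binding property as in the posted application.yml.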