How do I write a ValueJoiner when joining two Kafka Streams defined with Avro schemas?

Date: 2018-05-17 23:27:14

Tags: apache-kafka avro apache-kafka-streams

I am building an e-commerce application and am currently working with two data sources: order executions and broken sales. A broken sale is an execution that is invalid for any of a variety of reasons. A broken sale carries the same order number as the order it belongs to, so the two are joined on the order reference number and the order line item number.

Currently I have two topics: orders and broken. Both are defined with Avro schemas and built using SpecificRecord. The key is OrderReferenceNumber.

Fields for orders:

OrderReferenceNumber, Timestamp, OrderLine, ItemNumber, Quantity

Fields for broken:

OrderReferenceNumber, OrderLine, Timestamp

The corresponding Java classes are generated by running mvn clean package.

I need to left join orders with broken and include the following fields in the output:

OrderReferenceNumber, Timestamp, BrokenSaleTimestamp, OrderLine, ItemNumber, Quantity

Here is my code:

public static void main(String[] args) {
    // Declare variables
    final Map<String, String> avroSerdeConfig = Collections.singletonMap(KafkaAvroSerializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");

    // Add Kafka Streams Properties
    Properties streamsProperties = new Properties();
    streamsProperties.put(StreamsConfig.APPLICATION_ID_CONFIG, "orderProcessor");
    streamsProperties.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    streamsProperties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    streamsProperties.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
    streamsProperties.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, SpecificAvroSerde.class);
    streamsProperties.put(KafkaAvroSerializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "localhost:8081");

    // Specify Kafka Topic Names
    String orderTopic = "com.ecomapp.input.OrderExecuted";
    String brokenTopic = "com.ecomapp.input.BrokenSale";

    // Specify Serializer-Deserializer or Serdes for each Message Type
    Serdes.StringSerde stringSerde = new Serdes.StringSerde();
    Serdes.LongSerde longSerde = new Serdes.LongSerde();
    // For the Order Executed Message
    SpecificAvroSerde<OrderExecuted> ordersSpecificAvroSerde = new SpecificAvroSerde<OrderExecuted>();
    ordersSpecificAvroSerde.configure(avroSerdeConfig, false);
    // For the Broken Sale Message
    SpecificAvroSerde<BrokenSale> brokenSpecificAvroSerde = new SpecificAvroSerde<BrokenSale>();
    brokenSpecificAvroSerde.configure(avroSerdeConfig, false);


    StreamsBuilder streamBuilder = new StreamsBuilder();

    KStream<String, OrderExecuted> orders = streamBuilder
            .stream(orderTopic, Consumed.with(stringSerde, ordersSpecificAvroSerde))
            .selectKey((key, orderExec) -> orderExec.getMatchNumber().toString());
    KStream<String, BrokenSale> broken = streamBuilder
            .stream(brokenTopic, Consumed.with(stringSerde, brokenSpecificAvroSerde))
            .selectKey((key, brokenS) -> brokenS.getMatchNumber().toString());

    KStream<String, JoinOrdersExecutedNonBroken> joinOrdersNonBroken = orders
        .leftJoin(broken,
                (orderExec, brokenS) -> JoinOrdersExecutedNonBroken.newBuilder()
                        .setOrderReferenceNumber((Long) orderExec.get("OrderReferenceNumber"))
                        .setTimestamp((Long) orderExec.get("Timestamp"))
                        .setBrokenSaleTimestamp((Long) brokenS.get("Timestamp"))
                        .setOrderLine((Long) orderExec.get("OrderLine"))
                        .setItemNumber((String) orderExec.get("ItemNumber"))
                        .setQuantity((Long) orderExec.get("Quantity"))
                        .build(),
                JoinWindows.of(TimeUnit.MILLISECONDS.toMillis(1)),
                Joined.with(stringSerde, ordersSpecificAvroSerde, brokenSpecificAvroSerde))
        .peek((key, value) -> System.out.println("key = " + key + ", value = " + value));


    KafkaStreams orderStreams = new KafkaStreams(streamBuilder.build(), streamsProperties);
    orderStreams.start();

    // print the topology
    System.out.println(orderStreams.localThreadsMetadata());

    // shutdown hook to correctly close the streams application
    Runtime.getRuntime().addShutdownHook(new Thread(orderStreams::close));

}

When I run it, I get the following Maven compile error:

[ERROR] /Tech/Projects/jCom/src/main/java/com/ecomapp/kafka/orderProcessor.java:[96,26] incompatible types: cannot infer type-variable(s) VO,VR,K,V,VO
    (argument mismatch; org.apache.kafka.streams.kstream.Joined<K,V,com.ecomapp.input.BrokenSale> cannot be converted to org.apache.kafka.streams.kstream.Joined<java.lang.String,com.ecomapp.OrderExecuted,com.ecomapp.input.BrokenSale>)

The issue is in how I define the join. The Confluent documentation is not clear on how to do this when Avro schemas are involved (and I could not find any examples either). What is the right way to define this?

1 Answer:

Answer 0: (score: 0)

I'm not sure why Java can't resolve the types.

Try:

Joined.<String,OrderExecuted,BrokenSale>with(stringSerde, ordersSpecificAvroSerde, brokenSpecificAvroSerde))

to specify the types explicitly.
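
For context, here is how that fix would slot into the join from the question. This is a minimal sketch that reuses the names defined above (orders, broken, stringSerde, ordersSpecificAvroSerde, brokenSpecificAvroSerde, and the JoinOrdersExecutedNonBroken builder). The null check on brokenS is an addition of mine and assumes the output schema allows a missing broken-sale timestamp, since a left join also emits orders that have no matching broken sale:

    // Left join with the Joined type parameters spelled out so the compiler
    // does not have to infer the key and value types from the lambda.
    KStream<String, JoinOrdersExecutedNonBroken> joinOrdersNonBroken = orders
        .leftJoin(broken,
                (orderExec, brokenS) -> JoinOrdersExecutedNonBroken.newBuilder()
                        .setOrderReferenceNumber((Long) orderExec.get("OrderReferenceNumber"))
                        .setTimestamp((Long) orderExec.get("Timestamp"))
                        // brokenS is null when no broken sale fell inside the join window
                        // (assumes BrokenSaleTimestamp is nullable in the output schema)
                        .setBrokenSaleTimestamp(brokenS == null ? null : (Long) brokenS.get("Timestamp"))
                        .setOrderLine((Long) orderExec.get("OrderLine"))
                        .setItemNumber((String) orderExec.get("ItemNumber"))
                        .setQuantity((Long) orderExec.get("Quantity"))
                        .build(),
                JoinWindows.of(TimeUnit.MILLISECONDS.toMillis(1)),
                Joined.<String, OrderExecuted, BrokenSale>with(stringSerde, ordersSpecificAvroSerde, brokenSpecificAvroSerde));

Only the Joined argument actually changes; the explicit <String, OrderExecuted, BrokenSale> type witness supplies the key type and both value types that javac could not work out on its own, and the rest of the topology stays as it was.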