Testing a Kafka Processor API that uses SpecificAvroSerde

Date: 2018-07-19 17:45:27

Tags: apache-kafka avro apache-kafka-streams confluent-schema-registry

I'm trying to write a unit test for a custom stream processor and got stuck on serializing the messages I need to send for the test. I followed this Kafka example: https://kafka.apache.org/11/documentation/streams/developer-guide/testing.html. I use SpecificAvroSerde for my custom class (an auto-generated Avro class) in the stream, but I cannot configure it with a MockSchemaRegistryClient() in the test; I can only point it at a Schema Registry URL.

    Serde<MyCustomObject> valueSerde = new SpecificAvroSerde<>();
    Map<String, String> valueSerdeConfig = new HashMap<>();
    valueSerdeConfig.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "fake");
    valueSerdeConfig.put(AbstractKafkaAvroSerDeConfig.AUTO_REGISTER_SCHEMAS, "true");
    valueSerde.configure(valueSerdeConfig, false);
    ConsumerRecordFactory<Long, MyCustomObject> recordFactory = new ConsumerRecordFactory<>(new LongSerializer(), valueSerde.serializer());

With KafkaAvroSerializer, I can initialize it like this:

KafkaAvroSerializer serializer = new KafkaAvroSerializer(schemaRegistryClient);

But ConsumerRecordFactory does not accept a KafkaAvroSerializer as a parameter.

Is there an alternative, or some way to do this that I'm not aware of?

Thanks for your help.

2 Answers:

Answer 0 (score: 1)

Thanks for the proposed solutions, but I did manage to find one today. A Serde is composed of a serializer and a deserializer: https://kafka.apache.org/11/javadoc/org/apache/kafka/common/serialization/Serdes.html#serdeFrom-org.apache.kafka.common.serialization.Serializer-org.apache.kafka.common.serialization.Deserializer-, so I constructed my Serde from a KafkaAvroSerializer and a KafkaAvroDeserializer as follows:

Serde serde = Serdes.serdeFrom(new KafkaAvroSerializer(client), new KafkaAvroDeserializer(client));

Any class that implements Serializer can be used as part of a Serde.
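
For completeness, here is a minimal sketch of how this can be wired into the test setup from the question, using the MockSchemaRegistryClient mentioned there. MyCustomObject is the question's generated Avro class; the dummy URL, the specific.avro.reader flag and the helper class name are my own assumptions, not part of the original answer:

    import io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient;
    import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
    import io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig;
    import io.confluent.kafka.serializers.KafkaAvroDeserializer;
    import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;
    import io.confluent.kafka.serializers.KafkaAvroSerializer;
    import org.apache.kafka.common.serialization.LongSerializer;
    import org.apache.kafka.common.serialization.Serde;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.common.serialization.Serializer;
    import org.apache.kafka.streams.test.ConsumerRecordFactory;

    import java.util.HashMap;
    import java.util.Map;

    public class AvroTestSerdes {

        @SuppressWarnings("unchecked")
        static ConsumerRecordFactory<Long, MyCustomObject> recordFactory() {
            // One mock registry shared by serializer and deserializer (and the topology under test).
            SchemaRegistryClient schemaRegistryClient = new MockSchemaRegistryClient();

            Map<String, Object> config = new HashMap<>();
            // The URL is still required by the serde config, but the mock client is what actually gets used.
            config.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://fake:8081");
            // Have the deserializer return the generated SpecificRecord class instead of GenericRecord.
            config.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);

            Serde<Object> avroSerde = Serdes.serdeFrom(
                    new KafkaAvroSerializer(schemaRegistryClient, config),
                    new KafkaAvroDeserializer(schemaRegistryClient, config));

            // KafkaAvroSerializer is typed as Serializer<Object>, hence the unchecked cast.
            return new ConsumerRecordFactory<>(new LongSerializer(),
                    (Serializer<MyCustomObject>) (Serializer<?>) avroSerde.serializer());
        }
    }

The same schemaRegistryClient instance should also be passed to the serdes used by the topology under test, so that schemas registered while serializing the test input can be found when the processor deserializes.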

Answer 1 (score: 0)

I generate Java models for all of my Avro definitions:

#!/usr/bin/env bash

if [ ! -f avro-tools-1.8.2.jar ]; then
    wget http://tux.rainside.sk/apache/avro/avro-1.8.2/java/avro-tools-1.8.2.jar
    chmod +x avro-tools-1.8.2.jar
fi

java -jar avro-tools-1.8.2.jar compile schema ../avro/raw/* ../../java/

Then I just produce some messages into the mocked Kafka cluster:

public abstract class ViewPageEventGenerator {

    @NotNull
    public static KeyValue<List<KeyValue<CustomerKey, ViewPage>>, HashMap<String, String>> getSimpleViewPages() {
        HashMap<String, String> requestExpectedValuePairs = new HashMap<>();

        List<KeyValue<CustomerKey, ViewPage>> inputValues = Arrays.asList(
                new KeyValue<>(
                        new CustomerKey(1912, "Alan Turing"),
                        new ViewPage("Alan Turing", false,
                                Double.parseDouble(String.valueOf(System.currentTimeMillis())),
                                "https://alan.turing/", "192.168.0.1", "Turing Machine", "Punch card"
                        )
                ),
                new KeyValue<>(
                        new CustomerKey(1912, "Alan Turing"),
                        new ViewPage("Alan Turing", false,
                                Double.parseDouble(String.valueOf(System.currentTimeMillis() + 100)),
                                "https://alan.turing/", "192.168.0.1", "Turing Machine", "Punch card"
                        )
                ),
                new KeyValue<>(
                        new CustomerKey(1912, "Alan Turing"),
                        new ViewPage("Alan Turing", false,
                                Double.parseDouble(String.valueOf(System.currentTimeMillis() + 200)),
                                "https://alan.turing/", "192.168.0.1", "Turing Machine", "Punch card"
                        )
                )
        );
        requestExpectedValuePairs.put(
                "{project_id: 1912}",
                "{\"success\":true,\"data\":{\"count\":3}}"
        );

        return new KeyValue<>(inputValues, requestExpectedValuePairs);
    }
}

That's it. In the topology I use the generated Java models (classes) based on the Avro definitions.
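
For reference, a rough sketch (not part of the original answer) of how these generated records could be piped into a topology test with TopologyTestDriver; the helper class, the serde parameters and the "view-page-topic" name are assumptions:

    import org.apache.kafka.common.serialization.Serde;
    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.TopologyTestDriver;
    import org.apache.kafka.streams.test.ConsumerRecordFactory;

    import java.util.List;

    public class ViewPageTestHelper {

        // Feeds the generated view-page events into the topology under test.
        // The driver, the serdes and the "view-page-topic" name are placeholders.
        public static void pipeSimpleViewPages(TopologyTestDriver driver,
                                               Serde<CustomerKey> keySerde,
                                               Serde<ViewPage> valueSerde) {
            ConsumerRecordFactory<CustomerKey, ViewPage> factory =
                    new ConsumerRecordFactory<>(keySerde.serializer(), valueSerde.serializer());

            List<KeyValue<CustomerKey, ViewPage>> input =
                    ViewPageEventGenerator.getSimpleViewPages().key;
            for (KeyValue<CustomerKey, ViewPage> record : input) {
                driver.pipeInput(factory.create("view-page-topic", record.key, record.value));
            }
        }
    }

The request/expected-value pairs in the second element of the returned KeyValue can then be asserted against whatever the topology exposes, e.g. via driver.readOutput(...) or a state store.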