Kafka Connect-JDBC custom Avro schema

Posted: 2019-12-16 21:03:13

Tags: apache-kafka avro spring-kafka apache-kafka-connect confluent-schema-registry

I am following a tutorial about Kafka Connect, and I would like to know whether it is possible to receive the messages as instances of a specific class.

Tutorial: https://www.confluent.io/blog/simplest-useful-kafka-connect-data-pipeline-world-thereabouts-part-1/

Similar to the table presented in the tutorial, my schema looks like this:

IF OBJECT_ID(N'dbo.Products', N'U') IS NOT NULL
    DROP TABLE dbo.Products

CREATE TABLE Products (
    ProductID int  NOT NULL PRIMARY KEY,
    ProductName nvarchar(100) null,
    UnitPrice decimal(18,2) not null,
    UnitsInStock int not null,
    UnitsOnOrder int null,
    DateIn datetime
);
GO

create or alter procedure InsertProduct(
    @ProductId int,
    @ProductName nvarchar(100),
    @UnitPrice decimal(18,2),
    @UnitsInStock int,
    @UnitsOnOrder int
)
as
begin
    begin try
        begin transaction

        insert  into Products(ProductID, ProductName, UnitPrice, UnitsInStock, UnitsOnOrder, DateIn)
            values(@ProductID, @ProductName, @UnitPrice, @UnitsInStock, @UnitsOnOrder, GETDATE())

        commit transaction 
    end try
    begin catch
        print N'value of XACT_STATE()' + convert(nvarchar(20),XACT_STATE());
        if XACT_STATE() = 1
        begin 
            print N'Rollback Necessary' 
            rollback transaction
        end;
        throw 51000, N'There is an error with the application',1    

    end catch

end
GO

exec InsertProduct @ProductID = 1, @ProductName = 'B', @UnitPrice = 10.0, @UnitsInStock = 2, @UnitsOnOrder = 1

select * from Products

-- Generating an error
exec InsertProduct @ProductID = 1, @ProductName = 'C', @UnitPrice = 10.0, @UnitsInStock = 2, @UnitsOnOrder = 1

select * from Products

Based on the Avro format, I generated a class using Maven.

Audit.avsc:

{
   "namespace": "avro",
   "type": "record",
   "name": "Audit",
   "fields": [
      {"name": "c1", "type": "int"},
      {"name": "c2", "type": "string"},
      {"name": "create_ts", "type": "long"},
      {"name": "update_ts", "type": "long"}
   ]
}
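
With "namespace": "avro" and "name": "Audit", the Maven Avro plugin generates a class avro.Audit, and that fully qualified name is what the deserializer looks up when specific-record reading is enabled. A small sketch of the generated API, assuming standard Avro code generation:

// Assumes avro.Audit was generated from Audit.avsc by the Avro Maven plugin.
Audit audit = Audit.newBuilder()
        .setC1(1)
        .setC2("some value")
        .setCreateTs(System.currentTimeMillis())
        .setUpdateTs(System.currentTimeMillis())
        .build();

// The writer's schema name must resolve to this class for specific records.
System.out.println(Audit.getClassSchema().getFullName()); // prints "avro.Audit"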

Then I defined my consumer factory with my type:

public ConsumerFactory<String, Audit> auditConsumerFactory() { ... }

and the KafkaListener:

@KafkaListener(topics = "${kafka.mysql.topic}", containerFactory = "mysqlKafkaListenerContainerFactory")
public void receive(Audit audit) {
    System.out.println(audit);
    this.latch.countDown();
}
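
The container factory named in the annotation is not shown here; a minimal sketch of how it is typically wired with Spring Kafka, assuming a @Configuration class (the bean name comes from the @KafkaListener above, everything else is an assumption):

@Bean
public ConcurrentKafkaListenerContainerFactory<String, Audit> mysqlKafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, Audit> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    // Reuse the consumer factory below so records arrive as Audit instances
    factory.setConsumerFactory(auditConsumerFactory());
    return factory;
}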

But in the end I get an error like this:

2019-12-16 21:56:50.139 ERROR 31862 --- [ntainer#0-0-C-1] o.s.kafka.listener.LoggingErrorHandler   : Error while processing: null
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition mysql-audit-0 at offset 4. If needed, please seek past the record to continue consumption.
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 1
Caused by: org.apache.kafka.common.errors.SerializationException: Could not find class audit specified in writer's schema whilst finding reader's schema for a SpecificRecord.

EDIT: the ConsumerFactory with the deserializer:

public ConsumerFactory<String, Audit> auditConsumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaConfiguration.getKafkaBootstrapAddress());
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "test");
    props.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
    // Deserialize into the generated SpecificRecord class instead of GenericRecord
    props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);
    return new DefaultKafkaConsumerFactory<>(props);
}

I found the answer to my question on GitHub.

1 Answer:

Answer 0 (score: 1)

I don't know if there is any other thread about this problem, but in the end Confluent solved it. Add these three lines to the JDBC connector configuration:

"transforms": "AddNamespace",
"transforms.AddNamespace.type": "org.apache.kafka.connect.transforms.SetSchemaMetadata$Value",
"transforms.AddNamespace.schema.name": "my.namespace.NameOfTheSchema",
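
The SetSchemaMetadata single message transform overwrites the fully qualified name of the value schema so that it matches the package and name of the generated class, which lets the deserializer find it. For context, here is a sketch of how those lines might sit in a complete JDBC source connector configuration, loosely following the tutorial; the connection details, mode, and column names are assumptions:

{
  "name": "jdbc-source-audit",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://localhost:3306/demo?user=user&password=password",
    "table.whitelist": "audit",
    "mode": "timestamp+incrementing",
    "timestamp.column.name": "update_ts",
    "incrementing.column.name": "c1",
    "topic.prefix": "mysql-",
    "transforms": "AddNamespace",
    "transforms.AddNamespace.type": "org.apache.kafka.connect.transforms.SetSchemaMetadata$Value",
    "transforms.AddNamespace.schema.name": "my.namespace.NameOfTheSchema"
  }
}

For the schema in this question, "transforms.AddNamespace.schema.name" would presumably be "avro.Audit", matching the generated class.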

KAFKA-7883