I have two almost identical Kafka applications. Both listen to the binlog for changes to two tables. My problem is that one of them works fine, but when I try to start the second one I get the following exception:
org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: {"type":"record","name":"Key","namespace":"mysql.company.payments","fields":[{"name":"id","type":"long"}],"connect.name":"mysql.company.payments.Key"}
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
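The '<' (code 60) in the RestClientException makes me suspect the Schema Registry client received HTML instead of JSON (for example a proxy error page or a response from the wrong port). To see what the configured endpoint actually returns, a check along these lines should be enough (just a sketch run from the same server; SchemaRegistryCheck is a throwaway class, the URL comes from the same config value used in the code below):

import com.company.util.Configs;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SchemaRegistryCheck {

    public static void main(String[] args) throws Exception {
        // Same value that the streams app passes as schema.registry.url
        String registryUrl = Configs.getConfig("schemaRegistryUrl");

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(registryUrl + "/subjects"))
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // A reachable registry answers with JSON (a list of subjects);
        // an HTML body here would explain the "Unexpected character ('<')".
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}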
The schema file that the error points to has the following content:
{
  "type": "record",
  "name": "Key",
  "namespace": "mysql.company.payments",
  "fields": [
    {
      "name": "id",
      "type": "long"
    }
  ],
  "connect.name": "mysql.company.payments.Key"
}
The other application, which runs fine, has exactly the same Avro file except that the table name (payments) is replaced. Both applications run from the same server and connect to the same Kafka cluster. I use the Avro Maven plugin to generate Java classes from the Avro files, and the Key class is generated successfully.
These are the two relevant classes in my application:
import com.company.util.Configs;
import error.PaymentSerializationException;
import io.confluent.kafka.serializers.KafkaAvroSerializerConfig;
import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import payment.PaymentUpdateListener;

import java.util.Properties;

public class PaymentsMain {

    static Properties properties;

    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();
        properties = configProperties();

        StreamsBuilder streamsBuilder = watchForPaymentUpdate(builder);

        KafkaStreams kafkaStreams = new KafkaStreams(streamsBuilder.build(), properties);
        kafkaStreams.start();

        Runtime.getRuntime().addShutdownHook(new Thread(kafkaStreams::close));
    }

    private static StreamsBuilder watchForPaymentUpdate(StreamsBuilder builder) {
        PaymentUpdateListener paymentUpdateListener = new PaymentUpdateListener(builder);
        paymentUpdateListener.start();
        return builder;
    }

    private static Properties configProperties() {
        Properties streamProperties = new Properties();
        streamProperties.put(KafkaAvroSerializerConfig.SCHEMA_REGISTRY_URL_CONFIG, Configs.getConfig("schemaRegistryUrl"));
        streamProperties.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-kafka");
        streamProperties.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, Configs.getConfig("bootstrapServerUrl"));
        streamProperties.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 1000);
        streamProperties.put(StreamsConfig.STATE_DIR_CONFIG, "/tmp/state_dir");
        streamProperties.put(StreamsConfig.NUM_STREAM_THREADS_CONFIG, "3");
        streamProperties.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
        streamProperties.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
        streamProperties.put(StreamsConfig.METRICS_RECORDING_LEVEL_CONFIG, "DEBUG");
        streamProperties.put(StreamsConfig.DEFAULT_PRODUCTION_EXCEPTION_HANDLER_CLASS_CONFIG,
                PaymentSerializationException.class);
        return streamProperties;
    }
}
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;

public class PaymentUpdateListener {

    private StreamsBuilder builder;

    public PaymentUpdateListener(StreamsBuilder builder) {
        this.builder = builder;
    }

    public void start() {
        builder.stream("mysql.company.payments",
                Consumed.with(PaymentSerde.getGenericKeySerde(), PaymentSerde.getEnvelopeSerde()))
                .to("kafka-consumer.payment");
    }
}
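PaymentSerde itself is not shown above; it is just a small serde factory. The sketch below is an assumption about the typical GenericAvroSerde wiring with the registry URL, to give context for the Consumed.with(...) call, not the exact class:

import com.company.util.Configs;
import io.confluent.kafka.serializers.KafkaAvroSerializerConfig;
import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.common.serialization.Serde;

import java.util.Collections;
import java.util.Map;

public class PaymentSerde {

    // Assumption: the same registry URL as in PaymentsMain, read from the same config
    private static final Map<String, String> SERDE_CONFIG = Collections.singletonMap(
            KafkaAvroSerializerConfig.SCHEMA_REGISTRY_URL_CONFIG,
            Configs.getConfig("schemaRegistryUrl"));

    public static Serde<GenericRecord> getGenericKeySerde() {
        GenericAvroSerde serde = new GenericAvroSerde();
        serde.configure(SERDE_CONFIG, true);   // true = key serde
        return serde;
    }

    public static Serde<GenericRecord> getEnvelopeSerde() {
        GenericAvroSerde serde = new GenericAvroSerde();
        serde.configure(SERDE_CONFIG, false);  // false = value serde
        return serde;
    }
}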