kafka-avro-console-producer quick start fails

Date: 2016-09-21 20:13:25

Tags: confluent

I am using kafka-avro-console-producer from confluent-3.0.0, and I get an error when I run the following:

./bin/kafka-avro-console-producer --broker-list localhost:9092 --topic test1234 --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/tonydao/dev/bin/confluent-3.0.0/share/java/kafka-serde-tools/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/tonydao/dev/bin/confluent-3.0.0/share/java/confluent-common/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/tonydao/dev/bin/confluent-3.0.0/share/java/schema-registry/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
{"f1":"value1"}
{"f1":"value2"}

org.apache.kafka.common.errors.SerializationException: Error deserializing json  to Avro of schema {"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}
Caused by: java.io.EOFException
    at org.apache.avro.io.JsonDecoder.advance(JsonDecoder.java:138)
    at org.apache.avro.io.JsonDecoder.readString(JsonDecoder.java:219)
    at org.apache.avro.io.JsonDecoder.readString(JsonDecoder.java:214)
    at org.apache.avro.io.ResolvingDecoder.readString(ResolvingDecoder.java:201)
    at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:363)
    at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:355)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:157)
    at org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:193)
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:183)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:151)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:142)
    at io.confluent.kafka.formatter.AvroMessageReader.jsonToAvro(AvroMessageReader.java:189)
    at io.confluent.kafka.formatter.AvroMessageReader.readMessage(AvroMessageReader.java:157)
    at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:55)
    at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)

3 Answers:

Answer 0 (score: 2)

  1. Make sure you are running all the required services (ZooKeeper, Kafka server, Schema Registry) from the Confluent Kafka package only.
  2. You may have run some other version of Kafka on the same server before; if so, clean out the log directories (/tmp/kafka-logs is the default).
  3. Make sure you are not feeding it an empty line (a bare Enter), because that is treated as null and causes the exception.
  4. Try with a brand-new topic.
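Point 3 is the usual culprit: a blank line on stdin is read as an empty record, and the Avro JSON decoder hits end-of-input, which is exactly the EOFException in the trace above. One hedged workaround is to strip empty lines before they reach the producer; the filter itself (the producer pipeline is not shown, since it needs a running broker) can be sketched as:

```shell
# Sketch: drop blank lines so the Avro reader never sees an empty record.
# In practice you would pipe this filtered stream into
# kafka-avro-console-producer instead of letting it read the terminal.
printf '{"f1":"value1"}\n\n{"f1":"value2"}\n' | grep -v '^$'
# -> {"f1":"value1"}
# -> {"f1":"value2"}
```

The same `grep -v '^$'` filter works when the records live in a file fed to the producer via redirection.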

Answer 1 (score: 1)

First, when running this command, add escape characters before the double quotes as shown below, then press Enter once:

./bin/kafka-avro-console-producer --broker-list localhost:9092 --topic test1234 --property value.schema='{\"type\":\"record\",\"name\":\"myrecord\",\"fields\":[{\"name\":\"f1\",\"type\":\"string\"}]}'
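Whether the backslashes are actually needed depends on your shell: in POSIX shells, backslashes inside single quotes are passed through literally, so this escaping style mainly matters when the schema is wrapped in double quotes instead. A quick way to see exactly what string the producer would receive:

```shell
# In POSIX shells, single quotes preserve everything literally, including
# backslashes; compare what each quoting style hands to the program:
printf '%s\n' '{"f1":"x"}'      # -> {"f1":"x"}
printf '%s\n' '{\"f1\":\"x\"}'  # -> {\"f1\":\"x\"}  (backslashes survive)
```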

After running this command and pressing Enter once, enter the JSON object with plain, unescaped double quotes, like this:

{"f1":"value1"}

After each JSON object, press Enter only once. If you press it twice, the blank line is taken as the next JSON object, which is why you get that error. You receive no acknowledgement after entering a JSON object, but it has already been sent to Kafka; wait a minute or two, then check Kafka's Control Center, and the messages should appear in the given topic. Press Ctrl+C to exit the producer console.
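Independent of which quoting style you use, it helps to confirm that the schema string is well-formed JSON before handing it to `--property value.schema`, since a quoting mistake surfaces only later as a serialization error. A minimal pre-check (assuming `python3` is on the PATH, and using only its stdlib `json.tool`):

```shell
SCHEMA='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
# json.tool exits non-zero on malformed JSON, so this catches quoting
# mistakes before the producer ever sees the schema.
echo "$SCHEMA" | python3 -m json.tool > /dev/null && echo "schema is valid JSON"
# -> schema is valid JSON
```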

Answer 2 (score: 0)

This happens when a null value is entered as a producer message; it looks like the JSON-to-Avro conversion cannot handle null. Just enter the JSON and press Enter, then press Ctrl+D when you are done.

I noticed that my Avro producer did not show a '>' character to indicate that it was accepting messages, so the only response I got from the script came when I pressed Enter.