Kafka Connect - Failed to flush, timed out while waiting for producer to flush outstanding messages

Date: 2019-04-04 17:06:22

Tags: apache-kafka apache-kafka-connect

I am trying to use the Kafka Connect JDBC source connector in bulk mode with the following properties.

connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
timestamp.column.name=timestamp
connection.password=XXXXX
validate.non.null=false
tasks.max=1
producer.buffer.memory=2097152
batch.size=1000
producer.enable.idempotence=true
offset.flush.timeout.ms=300000
table.types=TABLE,VIEW
table.whitelist=materials
offset.flush.interval.ms=5000
mode=bulk
topic.prefix=mysql-
connection.user=kafka_connect_user
poll.interval.ms=200000
connection.url=jdbc:mysql://<DBNAME>
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter=org.apache.kafka.connect.storage.StringConverter

I am getting the following error about committing offsets, and changing various parameters seems to have little effect.

[2019-04-04 12:42:14,886] INFO WorkerSourceTask{id=SapMaterialsConnector-0} flushing 4064 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask)
[2019-04-04 12:42:19,886] ERROR WorkerSourceTask{id=SapMaterialsConnector-0} Failed to flush, timed out while waiting for producer to flush outstanding 712 messages (org.apache.kafka.connect.runtime.WorkerSourceTask)

3 Answers:

Answer 0 (score: 0)

The error indicates that many messages are buffered and cannot be flushed before the timeout is reached. To resolve this, you can:

  • increase the offset.flush.timeout.ms configuration parameter in your Kafka Connect worker config, or
  • reduce the amount of data being buffered by decreasing producer.buffer.memory in your Kafka Connect worker config. This is the best option when you have very large messages.
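As a sketch, the two suggestions above would look like the following in the Connect worker properties file (the values here are illustrative starting points, not tuned recommendations):

# Allow more time for the producer to flush before the offset commit times out
# (default is 5000 ms)
offset.flush.timeout.ms=60000

# Shrink the producer buffer so fewer messages accumulate between flushes
# (default is 33554432, i.e. 32 MB)
producer.buffer.memory=4194304

Note that producer.* settings placed in the worker config apply to the producers Connect creates for all source connectors on that worker.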

Answer 1 (score: 0)

When security.protocol=SSL is enabled, make sure you set the SSL parameters separately for the Connect worker and the Connect producer, providing SSL settings for both:

# Authentication settings for Connect workers
ssl.keystore.location=/var/private/ssl/kafka.worker.keystore.jks
ssl.keystore.password=worker1234
ssl.key.password=worker1234

# Authentication settings for Connect producers used with source connectors
producer.ssl.keystore.location=/var/private/ssl/kafka.source.keystore.jks
producer.ssl.keystore.password=connector1234
producer.ssl.key.password=connector1234

See https://docs.confluent.io/5.2.3/connect/security.html#separate-principals

Answer 2 (score: 0)

If you are trying to connect to Confluent Cloud, this error may be caused by missing configuration in the worker properties. Make sure you add both the producer and consumer configurations:

consumer.ssl.endpoint.identification.algorithm=https
consumer.sasl.mechanism=PLAIN
consumer.request.timeout.ms=20000
consumer.retry.backoff.ms=500
consumer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="API_KEY" password="SECRET";
consumer.security.protocol=SASL_SSL

producer.ssl.endpoint.identification.algorithm=https
producer.sasl.mechanism=PLAIN
producer.request.timeout.ms=20000
producer.retry.backoff.ms=500
producer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="API_KEY" password="SECRET";
producer.security.protocol=SASL_SSL