Kafka Streams throws a TimeoutException when sending state store records to a topic

Time: 2020-09-27 15:00:24

Tags: apache-kafka apache-kafka-streams confluent-platform

I get the following error when running a Kafka Streams application.

The Kafka Streams application loads a large amount of data (about 20 GB) and keeps it in a state store. During the initial phase, while all of this data is being loaded into the state store, the error below occurs.

I am trying to understand how this error affects my application and which tunable parameters I should use to avoid it.

2020-09-27 13:41:02,063 1058071 [kafka-producer-network-thread | dp-App-ac0ad3b3-c9c5-4d0a-9d15-0122120f9d8b-StreamThread-2-producer] ERROR o.a.k.s.p.i.RecordCollectorImpl - stream-thread [dp-App-ac0ad3b3-c9c5-4d0a-9d15-0122120f9d8b-StreamThread-2] task [29_1] Error sending record to topic dp-App-Store-changelog due to Expiring 7 record(s) for dp-App-Store-changelog-1:300000 ms has passed since batch creation; No more records will be sent and no more offsets will be recorded for this task. Enable TRACE logging to view failed record key and value.
You can increase the producer configs `delivery.timeout.ms` and/or `retries` to avoid this error. Note that `retries` is set to infinite by default.
org.apache.kafka.common.errors.TimeoutException: Expiring 7 record(s) for dp-App-Store-changelog-1:300000 ms has passed since batch creation
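The log itself points at the producer configs delivery.timeout.ms and retries. Below is a minimal sketch of raising the internal producer's timeouts through the Streams configuration, assuming the standard StreamsConfig/ProducerConfig API; the application id, bootstrap server, and the concrete timeout values are placeholders, not known-good settings for this workload:

import java.util.Properties;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.streams.StreamsConfig;

public class StreamsProducerTuning {

    public static Properties buildConfig() {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "dp-App");              // placeholder
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // placeholder

        // Give the internal producer more time to deliver large changelog batches
        // before they are expired (the value shown is illustrative only).
        props.put(StreamsConfig.producerPrefix(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG), 600000);

        // Allow each broker request more time before it is considered failed
        // (illustrative value; must stay below delivery.timeout.ms).
        props.put(StreamsConfig.producerPrefix(ProducerConfig.REQUEST_TIMEOUT_MS_CONFIG), 300000);

        return props;
    }
}

The producerPrefix wrapper simply namespaces the keys as producer.delivery.timeout.ms and producer.request.timeout.ms, so the same overrides can also be written as plain prefixed property strings.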

Any suggestions?

0 Answers:

No answers yet.