I'm trying to implement a variant of connecting a producer to a consumer, where the special case is that sometimes I need to produce one additional message per input message (e.g., one message to the output topic and one to another topic), while retaining the delivery guarantees.

I was thinking of using mapConcat and outputting multiple ProducerMessage.Message objects, but I'm worried about losing the guarantee in an edge case: the first message alone would be enough to commit that offset, leading to the potential loss of the second. It also doesn't seem like you can just use .flatMap, because that takes you into the Graph API, which gets messier; once everything is merged back into a single commit stream, it becomes much harder to ensure that duplicate offsets aren't simply ignored.
Consumer.committableSource(consumerSettings, Subscriptions.topics(inputTopic))
  .map(msg => (msg, addLineage(msg.record.value())))
  .mapConcat(input =>
    if (math.random > 0.25)
      List(
        ProducerMessage.Message(
          new ProducerRecord[Array[Byte], Array[Byte]](outputTopic, input._1.record.key(), input._2),
          input._1.committableOffset
        ))
    else
      List(
        ProducerMessage.Message(
          new ProducerRecord[Array[Byte], Array[Byte]](outputTopic, input._1.record.key(), input._2),
          input._1.committableOffset
        ),
        ProducerMessage.Message(
          new ProducerRecord[Array[Byte], Array[Byte]](outputTopic2, input._1.record.key(), input._2),
          input._1.committableOffset
        ))
  )
  .via(Producer.flow(producerSettings))
  .map(_.message.passThrough)
  .batch(max = 20, first => CommittableOffsetBatch.empty.updated(first)) {
    (batch, elem) => batch.updated(elem)
  }
  .mapAsync(parallelism = 3)(_.commitScaladsl())
  .runWith(Sink.ignore)
The original 1-to-1 documentation is here: https://doc.akka.io/docs/akka-stream-kafka/current/consumer.html#connecting-producer-and-consumer

Has anyone thought about or solved this problem?
Answer 0 (score: 2)
The Alpakka Kafka connector recently introduced flexiFlow, which supports your use case: Let one stream element produce multiple messages to Kafka.
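A sketch of how the stream above might look using that API (assuming Alpakka Kafka 1.0+, where ProducerMessage.single, ProducerMessage.multi and Producer.flexiFlow are available; consumerSettings, producerSettings, the topic names and addLineage are taken from the question and assumed to be in scope):

```scala
import akka.kafka.ConsumerMessage.CommittableOffsetBatch
import akka.kafka.{ProducerMessage, Subscriptions}
import akka.kafka.scaladsl.{Consumer, Producer}
import akka.stream.scaladsl.Sink
import org.apache.kafka.clients.producer.ProducerRecord

Consumer.committableSource(consumerSettings, Subscriptions.topics(inputTopic))
  .map { msg =>
    val value = addLineage(msg.record.value())
    if (math.random > 0.25)
      // One record, one offset to commit.
      ProducerMessage.single(
        new ProducerRecord[Array[Byte], Array[Byte]](outputTopic, msg.record.key(), value),
        msg.committableOffset)
    else
      // Both records share one pass-through offset: the offset is emitted
      // downstream only after all records in the envelope are produced,
      // so the second message cannot be lost to an early commit.
      ProducerMessage.multi(
        List(
          new ProducerRecord[Array[Byte], Array[Byte]](outputTopic, msg.record.key(), value),
          new ProducerRecord[Array[Byte], Array[Byte]](outputTopic2, msg.record.key(), value)),
        msg.committableOffset)
  }
  .via(Producer.flexiFlow(producerSettings))
  .map(_.passThrough)
  .batch(max = 20, CommittableOffsetBatch.empty.updated(_))(_.updated(_))
  .mapAsync(parallelism = 3)(_.commitScaladsl())
  .runWith(Sink.ignore)
```

This avoids the mapConcat edge case entirely: because the multi-message envelope carries a single committable offset, there is exactly one commit per input message regardless of how many records it fans out into.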