Gracefully restarting a Reactive-Kafka consumer stream on failure

Date: 2018-05-22 10:56:45

Tags: scala apache-kafka akka akka-stream reactive-kafka

Question: When I restart/complete/stop the stream, the old consumer does not die/shut down:

[INFO ] a.a.RepointableActorRef -
  Message [akka.kafka.KafkaConsumerActor$Internal$Stop$] 
  from Actor[akka://ufo-sightings/deadLetters]
  to Actor[akka://ufo-sightings/system/kafka-consumer-1#1896610594]
  was not delivered. [1] dead letters encountered.

Description: I am building a service that receives messages from a Kafka topic and sends them to an external service via HTTP requests.

  1. The connection to the external service can break, and my service needs to retry the request.

  2. Also, if there is an error in the stream, the whole stream needs to be restarted.

  3. Finally, sometimes I don't need the stream and its corresponding Kafka consumer, and I want to shut the whole stream down.

  4. So I have a stream:

    Consumer.committableSource(customizedSettings, subscriptions)
      .flatMapConcat(sourceFunction)
      .toMat(Sink.ignore)(Keep.both)
      .run()
    

    The HTTP request is sent in sourceFunction.
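Point 1 above (retrying a failed request to the external service) can be sketched with a plain-`Future` retry helper. This is my own illustration with hypothetical names (`RetrySketch`, `retry`), not the asker's actual code; in their setting, `op` would be the HTTP call made inside `sourceFunction`:

```scala
import scala.concurrent.{ExecutionContext, Future}

// Hypothetical retry helper: re-run a Future-producing operation up to
// `times` more attempts after a failure. If all attempts fail, the
// resulting Future fails with the last exception.
object RetrySketch {
  def retry[T](times: Int)(op: () => Future[T])(implicit ec: ExecutionContext): Future[T] =
    op().recoverWith {
      case _ if times > 0 => retry(times - 1)(op)
    }
}
```

In a real stream you would normally also delay between attempts (e.g. with `akka.pattern.after`) rather than retrying immediately.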

    I followed the new Kafka consumer restart instructions in the new documentation:
    RestartSource.withBackoff(
        minBackoff = 20.seconds,
        maxBackoff = 5.minutes,
        randomFactor = 0.2) { () =>
      Consumer.committableSource(customizedSettings, subscriptions)
        .watchTermination() {
          case (consumerControl, streamComplete) =>
            logger.info(s"Started watching Kafka consumer id = ${consumer.id} termination: is shutdown: ${consumerControl.isShutdown}, is completed: ${streamComplete.isCompleted}")
            consumerControl.isShutdown.map(_ =>
              logger.info(s"Shutdown of consumer finally happened, id = ${consumer.id} at ${DateTime.now}"))
            streamComplete
              .flatMap { _ =>
                consumerControl.shutdown().map(_ =>
                  logger.info(s"3. consumer id = ${consumer.id} SHUTDOWN at ${DateTime.now} GRACEFULLY: CLOSED FROM UPSTREAM"))
              }
              .recoverWith {
                case _ =>
                  consumerControl.shutdown().map(_ =>
                    logger.info(s"3. consumer id = ${consumer.id} SHUTDOWN at ${DateTime.now} ERROR: CLOSED FROM UPSTREAM"))
              }
        }
        .flatMapConcat(sourceFunction)
    }
      .viaMat(KillSwitches.single)(Keep.right)
      .toMat(Sink.ignore)(Keep.left)
      .run()
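For intuition on the `withBackoff` parameters: the delay doubles from `minBackoff` on each restart and is capped at `maxBackoff`; `randomFactor` then adds jitter on top. A minimal sketch of the deterministic part of that schedule (my own illustration, not akka's internal code):

```scala
import scala.concurrent.duration._

// Deterministic part of the exponential backoff schedule used by
// RestartSource.withBackoff: restart n waits min(minBackoff * 2^n, maxBackoff).
// The real implementation additionally multiplies in random jitter
// controlled by randomFactor. Valid for restartCount < 63 (Long shift).
object BackoffSketch {
  def delay(restartCount: Int,
            minBackoff: FiniteDuration,
            maxBackoff: FiniteDuration): FiniteDuration = {
    val doubled = minBackoff * (1L << restartCount)
    if (doubled > maxBackoff) maxBackoff else doubled
  }
}
```

With the settings above, the delays run 20s, 40s, 80s, 160s, then stay pinned at 5 minutes (before jitter).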
    

    There is an open issue discussing this non-terminating consumer in complex Akka streams, but there is no solution yet.

    Is there a workaround to force the Kafka consumer to terminate?

1 answer:

Answer 0 (score: 1)

How about wrapping the consumer in an Actor and registering a KillSwitch? See: https://doc.akka.io/docs/akka/2.5/stream/stream-dynamic.html#dynamic-stream-handling

Then, in the Actor's postStop method, you can terminate the stream. By wrapping the Actor in a BackoffSupervisor, you get exponential backoff.

Example actor: https://github.com/tradecloud/kafka-akka-extension/blob/master/src/main/scala/nl/tradecloud/kafka/KafkaSubscriberActor.scala#L27
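A minimal sketch of the answer's suggestion, under these assumptions: akka 2.5 and akka-stream-kafka on the classpath; `customizedSettings`, `subscriptions`, and `sourceFunction` are the asker's definitions, assumed in scope; the actor and supervisor names are my own:

```scala
import akka.actor.{Actor, ActorLogging, Props}
import akka.kafka.scaladsl.Consumer
import akka.pattern.{Backoff, BackoffSupervisor}
import akka.stream.{ActorMaterializer, KillSwitches, UniqueKillSwitch}
import akka.stream.scaladsl.{Keep, Sink}
import scala.concurrent.duration._

// The actor owns the stream; stopping the actor stops the stream (and with
// it the Kafka consumer) via the kill switch registered in postStop.
class ConsumerStreamActor extends Actor with ActorLogging {
  private implicit val mat: ActorMaterializer = ActorMaterializer()(context)

  private val killSwitch: UniqueKillSwitch =
    Consumer.committableSource(customizedSettings, subscriptions)
      .flatMapConcat(sourceFunction)
      .viaMat(KillSwitches.single)(Keep.right)
      .toMat(Sink.ignore)(Keep.left)
      .run()

  override def postStop(): Unit = {
    killSwitch.shutdown() // stops the stream, which releases the consumer
    super.postStop()
  }

  override def receive: Receive = Actor.emptyBehavior
}

// Wrapping the actor in a BackoffSupervisor (akka 2.5 API) restarts it with
// exponential backoff whenever it stops, e.g. after a stream failure.
object ConsumerStreamSupervisor {
  val props: Props = BackoffSupervisor.props(
    Backoff.onStop(
      Props(new ConsumerStreamActor),
      childName = "consumer-stream",
      minBackoff = 20.seconds,
      maxBackoff = 5.minutes,
      randomFactor = 0.2))
}
```

To shut everything down deliberately (point 3 of the question), stop the supervisor actor; its child's postStop then closes the stream and the consumer.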