Trying to write a tuple to a Flink Kafka sink

Time: 2018-10-05 18:26:40

Tags: apache-kafka apache-flink

I am trying to write a streaming application that both reads from and writes to Kafka. I currently have the following, but I have to call toString on my tuples first.

import java.util.Properties

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time
import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaConsumer08, FlinkKafkaProducer08}
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

object StreamingJob {
  def main(args: Array[String]): Unit = {
    // set up the streaming execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val properties = new Properties()
    properties.setProperty("bootstrap.servers", "localhost:9092")
    properties.setProperty("zookeeper.connect", "localhost:2181")
    properties.setProperty("group.id", "test")

    // read lines of text from Kafka
    val consumer = env.addSource(
      new FlinkKafkaConsumer08[String]("topic", new SimpleStringSchema(), properties))

    // count words over 5-second tumbling windows, producing (word, count) tuples
    val counts = consumer
      .flatMap { _.toLowerCase.split("\\W+") filter { _.nonEmpty } }
      .map { (_, 1) }
      .keyBy(0)
      .timeWindow(Time.seconds(5))
      .sum(1)

    val producer = new FlinkKafkaProducer08[String](
      "localhost:9092",
      "my-topic",
      new SimpleStringSchema())

    // the sink is typed on String, so the tuples have to be stringified first
    counts.map(_.toString()).addSink(producer)

    env.execute("Window Stream WordCount")
  }
}

The closest I have gotten is the following, but FlinkKafkaProducer08 refuses to accept the type parameter as part of the constructor (that constructor argument expects a SerializationSchema, and TypeSerializerOutputFormat is an OutputFormat from the batch API, so this does not compile).

val producer = new FlinkKafkaProducer08[(String, Int)](
  "localhost:9092",
  "my-topic",
  new TypeSerializerOutputFormat[(String, Int)])

counts.addSink(producer)

I would like to know whether it is possible to write the tuples directly to my Kafka sink.

1 answer:

Answer 0 (score: 0)

You need a class like this that serializes your tuples:

private class SerSchema extends SerializationSchema[Tuple2[String, Int]] {
  override def serialize(tuple2: Tuple2[String, Int]): Array[Byte] = ...
}
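
For completeness, here is a minimal sketch of that schema with the serialize body filled in, wired into the producer from the question. The CSV-style encoding is an illustrative assumption (any encoding that yields a byte array will do), and the import paths assume the same pre-1.4 Flink package layout as FlinkKafkaProducer08:

import java.nio.charset.StandardCharsets

import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer08
import org.apache.flink.streaming.util.serialization.SerializationSchema

// Illustrative: encode each (word, count) tuple as a UTF-8 "word,count" line.
private class SerSchema extends SerializationSchema[(String, Int)] {
  override def serialize(tuple2: (String, Int)): Array[Byte] =
    s"${tuple2._1},${tuple2._2}".getBytes(StandardCharsets.UTF_8)
}

// The producer is now typed on the tuple, so the counts stream from the
// question can be sunk directly, without the map(_.toString()) step:
val producer = new FlinkKafkaProducer08[(String, Int)](
  "localhost:9092",
  "my-topic",
  new SerSchema())

counts.addSink(producer)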