I want to join two streams coming from a Kafka producer, but the join does not work. I define my assigner with AssignerWithPeriodicWatermarks, and I try to join the two streams using a 3-minute window, but I get no output. I printed both streams to make sure their events are close enough in time.
import java.util.Properties

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.streaming.api.functions.AssignerWithPeriodicWatermarks
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.watermark.Watermark
import org.apache.flink.streaming.api.windowing.assigners.SlidingEventTimeWindows
import org.apache.flink.streaming.api.windowing.time.Time
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
import spray.json._

object Job {

  class Assigner extends AssignerWithPeriodicWatermarks[String] {
    // 1 s in ms
    val bound: Long = 1000
    // the maximum observed timestamp
    var maxTs: Long = Long.MinValue

    override def getCurrentWatermark: Watermark = {
      new Watermark(maxTs - bound)
    }

    override def extractTimestamp(r: String, previousTS: Long): Long = {
      maxTs = Math.max(maxTs, previousTS)
      previousTS
    }
  }

  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment // createLocalEnvironment()
    env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

    val properties = new Properties()
    properties.setProperty("bootstrap.servers", "localhost:9093")
    properties.setProperty("group.id", "test")

    val consumerId = new FlinkKafkaConsumer[String]("topic_id", new SimpleStringSchema(), properties)
    val streamId = env.addSource(consumerId).assignTimestampsAndWatermarks(new Assigner)
    val streamIdParsed = streamId
      .map(s => s.parseJson)
      .map(value => (value.asJsObject.getFields("id")(0).toString(),
        value.asJsObject.getFields("m", "w")))

    val consumerV = new FlinkKafkaConsumer[String]("topic_invoice", new SimpleStringSchema(), properties)
    val streamV = env.addSource(consumerV).assignTimestampsAndWatermarks(new Assigner)
    val streamVParsed = streamV
      .map(s => s.parseJson)
      .map(value => (value.asJsObject.getFields("id")(0).toString(),
        value.asJsObject.getFields("products")(0).toString().parseJson.asJsObject.getFields("id2", "id3")))

    streamIdParsed.join(streamVParsed)
      .where(_._1).equalTo(_._1)
      .window(SlidingEventTimeWindows.of(Time.seconds(60), Time.seconds(1)))
      .apply { (e1, e2) => (e1._1, "test") }
      .print()
  }
}
Answer 0 (score: 0)

Here are things that could be going wrong (you will need to check them yourself, since the information you provided is too sparse to narrow it down):
Answer 1 (score: 0)
The problem is that you have not set the autoWatermarkInterval, while you are using a periodic assigner. You need to do the following:

env.getConfig.setAutoWatermarkInterval([someInterval])

This should fix the problem of watermarks not being generated.
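A minimal sketch of the suggested fix applied to the asker's setup. The 1000 ms value is an illustrative choice, not something from the answer; pick an interval that suits your latency requirements:

```scala
import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.streaming.api.scala._

val env = StreamExecutionEnvironment.getExecutionEnvironment
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

// With an AssignerWithPeriodicWatermarks, getCurrentWatermark is only
// polled every autoWatermarkInterval milliseconds. If that interval is 0,
// no watermarks are emitted and event-time windows never fire.
env.getConfig.setAutoWatermarkInterval(1000L) // assumed: emit a watermark every second
```

Set this before adding the Kafka sources, so that both assigned streams produce watermarks and the event-time join windows can be triggered.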