For every event a user performs on our platform we have a Kafka message in a single topic. Every event/Kafka message has a common field userId. We now want to know, from that topic, how many unique users we had per hour. So we are not interested in per-event-type or per-user counts; we only want to know how many unique users were active each hour. What is the simplest way to achieve this? My current idea does not seem very simple — see the pseudocode here:
stream
  .selectKey()   // key = userId
  .groupByKey()  // group by userId, results in a KGroupedStream[UserId, Value]
  .aggregate(    // initializer, merger, and aggregator simply deliver a constant value; the message is now just a tick for that userId key
    TimeWindows.of(3600000)
  )              // result of aggregate is KTable[Windowed[UserId], Const]
  .toStream()    // convert to a stream to be able to map the key in the next step
  .map()         // map the key only: Windowed[UserId] -> startMs of the window, value = UserId
  .groupByKey()  // group by startMs of the window, which was selected as key before
  .count()       // results in a KTable from startMs of the window to the count of users (== unique userIds)
Is there a simpler way? I may be overlooking something.
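For reference, the intended result — distinct users per one-hour window — can be illustrated with plain Java collections, independent of the Streams API. The `Event` record and the sample data are made up for illustration:

```java
import java.util.*;
import java.util.stream.*;

public class HourlyUniqueUsers {
    // For this computation an event only needs a userId and a timestamp.
    record Event(String userId, long timestampMs) {}

    // Bucket events into one-hour windows and count distinct userIds per window.
    static Map<Long, Long> uniqueUsersPerHour(List<Event> events) {
        long hourMs = 3_600_000L;
        return events.stream()
                .collect(Collectors.groupingBy(
                        e -> (e.timestampMs() / hourMs) * hourMs, // window start
                        Collectors.mapping(Event::userId,
                                Collectors.collectingAndThen(Collectors.toSet(),
                                        s -> (long) s.size()))));
    }

    public static void main(String[] args) {
        List<Event> events = List.of(
                new Event("alice", 0L),
                new Event("bob", 1_000L),
                new Event("alice", 2_000L),      // duplicate user within hour 0
                new Event("carol", 3_600_000L)); // falls into hour 1
        // hour 0 has 2 unique users, hour 1 has 1
        System.out.println(uniqueUsersPerHour(events));
    }
}
```

The duplicate "alice" event in the first hour is absorbed by the per-window `Set`, which is exactly the deduplication the windowed aggregate in the topology provides.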
Answer 0 (score: 2)
There are two things you can do:
You can merge selectKey() and groupByKey() into a single groupBy().
You don't need the toStream().map() step; you can regroup the first KTable directly on the new key.
Something like this:
stream.groupBy(/* KeyValueMapper that returns the userId as the grouping key */)
      .aggregate(/* initializer, aggregator, ... */ TimeWindows.of(TimeUnit.HOURS.toMillis(1)))
      .groupBy(/* KeyValueMapper that returns the window start as the new grouping key */)
      .count()
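The shape of this condensed topology — first build a table keyed by (window, userId), then re-key that table by window start and count its rows — can be mimicked with plain collections. The names below are illustrative stand-ins, not the Streams API:

```java
import java.util.*;
import java.util.stream.*;

public class TwoStageUniqueCount {
    static final long HOUR_MS = 3_600_000L;

    record Event(String userId, long timestampMs) {}

    // Stage 1: analogue of the windowed aggregate — a "table" with one row
    // per (windowStart, userId); repeated events for a user collapse into one row.
    static Set<Map.Entry<Long, String>> windowedUserTable(List<Event> events) {
        return events.stream()
                .map(e -> Map.entry((e.timestampMs() / HOUR_MS) * HOUR_MS, e.userId()))
                .collect(Collectors.toSet());
    }

    // Stage 2: analogue of the regroup-and-count — re-key each row by
    // window start and count the rows per window.
    static Map<Long, Long> countByWindow(Set<Map.Entry<Long, String>> table) {
        return table.stream()
                .collect(Collectors.groupingBy(Map.Entry::getKey, Collectors.counting()));
    }

    public static void main(String[] args) {
        var table = windowedUserTable(List.of(
                new Event("alice", 0L),
                new Event("alice", 10L), // deduplicated in stage 1
                new Event("bob", 20L)));
        // window 0 counts 2 rows: (0, alice) and (0, bob)
        System.out.println(countByWindow(table));
    }
}
```

Because stage 1 already deduplicated users within each window, the plain count() in stage 2 yields the number of unique users — which is why the answer's topology needs no distinct-counting aggregator.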