Tags: apache-spark apache-spark-sql spark-streaming
Consider this kind of conversion in Spark SQL:

BigInt

How can I write a SQL statement that achieves the same result? With the DataFrame API I can do it like this:

df.withWatermark("timestamp", "1 hour")
  .groupBy(window($"timestamp", "10 seconds"), $"partition")
  .count()
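For the grouping part, Spark SQL exposes the same `window` function, so a sketch of an equivalent query might look like the following. The view name `events` is an assumption (the streaming DataFrame would first need to be registered, e.g. via `createOrReplaceTempView`), and note that the watermark itself has no SQL syntax in open-source Spark, so `withWatermark` would still be applied on the DataFrame before registering the view:

```sql
-- Assumes: df.withWatermark("timestamp", "1 hour").createOrReplaceTempView("events")
-- `partition` is backquoted because PARTITION is a reserved keyword in Spark SQL.
SELECT window(timestamp, '10 seconds') AS time_window,
       `partition`,
       count(*) AS cnt
FROM events
GROUP BY window(timestamp, '10 seconds'), `partition`
```

The result of `spark.sql(...)` on this query should match the DataFrame version above, since both compile to the same windowed-aggregation plan.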