I want to retrieve the Kafka offset of every record of the RDD inside foreachRDD. My topic has a single partition, so my RDD ends up with a single partition as well. I basically tried something like this:
dStream.foreachRDD { rdd =>
  if (!rdd.isEmpty) {
    // get the offset of the first record in this batch
    val firstOffset = rdd.asInstanceOf[HasOffsetRanges].offsetRanges(0).fromOffset
    // number each record, then shift every index by the first offset
    val rddWithOffset = rdd.map(_.value)
      .zipWithIndex()
      .map { case (v, i) => (v, i + firstOffset) }
  }
}
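If you are on the spark-streaming-kafka-0-10 integration (which the _.value call suggests), there is a simpler route: each element of the stream is a ConsumerRecord that already carries its own offset, so no index arithmetic is needed at all. A minimal sketch under that assumption:

// Sketch assuming the kafka-0-10 direct stream, where every element is an
// org.apache.kafka.clients.consumer.ConsumerRecord exposing offset() directly.
dStream.foreachRDD { rdd =>
  if (!rdd.isEmpty) {
    // pair each value with the offset Kafka assigned to that exact record
    val rddWithOffset = rdd.map(record => (record.value, record.offset))
    rddWithOffset.foreach(println)
  }
}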
For example, in my producer I send the messages in a loop and store the loop index in a column named position, like this:
+------+-----+--------+
| name| age|position|
+------+-----+--------+
|johnny| 26| 1|
| chloe| 42| 2|
| brian| 19| 3|
| eliot| 35| 4|
+------+-----+--------+
Unfortunately, I noticed that when I add the offset column in my consumer, the order is not maintained:
+------+-----+--------+------+
| name| age|position|offset|
+------+-----+--------+------+
|johnny| 26| 1| 1|
| chloe| 42| 2| 3|
| brian| 19| 3| 4|
| eliot| 35| 4| 2|
+------+-----+--------+------+
It seems I lose the ordering somewhere in this process.
Do you have any ideas? Thanks.
By the way, my Java producer looks like this:
KafkaRestProducer<String, Object> producer = new KafkaRestProducer<>(props);

ArrayList<String> names = new ArrayList<String>();
names.add("johnny");
names.add("chloe");
names.add("brian");
names.add("eliot");

ArrayList<Integer> ages = new ArrayList<Integer>();
ages.add(26);
ages.add(42);
ages.add(19);
ages.add(35);

for (int i = 0; i < names.size(); ++i) {
    String name = names.get(i);
    int age = ages.get(i);
    // position in the tables above is 1-based, hence i + 1
    Person person = Person
        .newBuilder()
        .setName(name)
        .setAge(age)
        .setPosition(i + 1)
        .build();
    ProducerRecord<String, Object> record =
        new ProducerRecord<>("/apps/PERSON/streams:myTopic", name, person);
    producer.send(record, null);
    System.out.println(i);
}
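One thing worth checking on the producer side: with the standard Kafka producer, asynchronous sends combined with retries and more than one in-flight request per connection can reorder records within a partition, which would explain the shuffled offsets. A minimal sketch of the relevant settings, using the stock org.apache.kafka.clients.producer config keys (whether KafkaRestProducer honors the same properties is an assumption):

// Sketch: producer settings that preserve per-partition ordering across
// retries. Config keys are from the stock Java producer; KafkaRestProducer
// accepting them is an assumption.
val props = new java.util.Properties()
props.put("acks", "all")                                // wait for full acknowledgement
props.put("retries", "3")                               // retry transient failures
props.put("max.in.flight.requests.per.connection", "1") // prevents reordering on retry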
Answer 0 (score: 1)
My English is poor. I use this code:
import kafka.common.TopicAndPartition
import kafka.message.MessageAndMetadata
import kafka.serializer.StringDecoder
import org.apache.spark.streaming.kafka.KafkaUtils

val Array(brokers, topic, groupId) = args
val kafkaParams = Map[String, String]("metadata.broker.list" -> brokers, "group.id" -> groupId)
// read partition 0 of the topic, starting from offset 1
val topicPartition = Map[TopicAndPartition, Long](TopicAndPartition(topic, 0) -> 1L)
// keep the Kafka offset of each message next to its body
val messageHandler = (mmd: MessageAndMetadata[String, String]) => (mmd.offset, mmd.message)
val kafkaStream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder, (Long, String)](
  ssc, kafkaParams, topicPartition, messageHandler)
kafkaStream.foreachRDD(rdd => rdd.foreach(println))
Output: (offset, lineOfMessage) ...
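This uses the older 0.8 direct stream API (MessageAndMetadata). If you keep the index-arithmetic approach from the question instead, it generalizes to more than one partition via the offset ranges Spark attaches to every direct-stream batch; a sketch, assuming the kafka-0-10 package (the 0.8 equivalent lives in org.apache.spark.streaming.kafka):

// Sketch: per-record offsets that also work with several Kafka partitions.
// In a direct-stream batch, RDD partition i maps to offsetRanges(i), and
// records within a partition arrive in offset order, so fromOffset + index
// reconstructs each record's offset.
import org.apache.spark.streaming.kafka010.{HasOffsetRanges, OffsetRange}

dStream.foreachRDD { rdd =>
  val ranges: Array[OffsetRange] = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
  val withOffsets = rdd.mapPartitionsWithIndex { (partition, records) =>
    val fromOffset = ranges(partition).fromOffset
    records.zipWithIndex.map { case (record, i) => (record, fromOffset + i) }
  }
  withOffsets.foreach(println)
}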