How to stop KafkaUtils from overriding my Kafka group.id in a Spark job

Asked: 2018-10-23 18:45:59

Tags: scala apache-spark apache-kafka spark-streaming kafka-consumer-api

Part of my Spark job:

 val kafkaParams = Map[String, Object](
   "bootstrap.servers" -> "localhost:9092", // assumed broker address; adjust to your cluster
   "key.deserializer" -> classOf[StringDeserializer],
   "value.deserializer" -> classOf[StringDeserializer],
   "group.id" -> "mytopic"
 )

 try {
   val inputStream = KafkaUtils.createDirectStream(ssc, PreferConsistent, Subscribe[String, String](Array(inputTopic), kafkaParams))
   val processedStream = inputStream.map(record => record.value)
   processedStream.print()
   ssc.start
   ssc.awaitTermination
 } finally {
   ...
 }

I get the following log output:

2018-10-23 14:35:26 WARN  KafkaUtils:66 - overriding executor group.id to spark-executor-mytopic

How do I disable this group.id override? Any input is welcome. Thanks.
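For what it's worth, in spark-streaming-kafka-0-10 this override is intentional: Spark rewrites the executor-side group.id (prefixing it with spark-executor-) so that driver and executor consumers do not join the same consumer group, and there is no configuration switch to turn it off. If the goal is only to silence the WARN line, one option (assuming Spark's default log4j 1.x logging) is to raise the log level for that class in log4j.properties:

```properties
# log4j.properties — raise KafkaUtils above WARN so the
# "overriding executor group.id" message is no longer printed.
# The override itself still happens; this only hides the log line.
log4j.logger.org.apache.spark.streaming.kafka010.KafkaUtils=ERROR
```

The executors will still consume with the group id spark-executor-mytopic, which is harmless as long as nothing else depends on the exact group name.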

0 Answers:

There are no answers yet.