Kafka server shuts down after continuously writing data to a topic

Asked: 2018-06-29 18:47:37

Tags: scala apache-kafka kafka-producer-api

When I use Spark Direct Streaming 2.1 to write data to a Kafka topic, the Kafka broker shuts down after a while.

    import java.util.HashMap
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig, ProducerRecord}

    val props = new HashMap[String, Object]()
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "ip:9092")
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
      "org.apache.kafka.common.serialization.StringSerializer")
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
      "org.apache.kafka.common.serialization.StringSerializer")
    val producer = new KafkaProducer[String, String](props)

    // ProducerRecord(topic, partition, key, value)
    val record = new ProducerRecord[String, String](topic, partition, key, message)
    producer.send(record)
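
The question does not show how the producer is driven from the streaming job. As context, here is a minimal sketch of the usual pattern, assuming `stream` is a `DStream[String]` obtained from `KafkaUtils.createDirectStream` and `"output-topic"` is the destination topic; both names are hypothetical and not taken from the question:

    import java.util.HashMap
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig, ProducerRecord}

    stream.foreachRDD { rdd =>
      rdd.foreachPartition { messages =>
        // KafkaProducer is not serializable, so build one per partition on the executor
        val props = new HashMap[String, Object]()
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "ip:9092")
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
          "org.apache.kafka.common.serialization.StringSerializer")
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
          "org.apache.kafka.common.serialization.StringSerializer")
        val producer = new KafkaProducer[String, String](props)

        messages.foreach { msg =>
          producer.send(new ProducerRecord[String, String]("output-topic", msg))
        }
        // Close so buffered records are flushed before the task finishes
        producer.close()
      }
    }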

0 Answers:

There are no answers yet.