spark kafka producer serializable

Date: 2016-11-09 06:18:49

Tags: scala apache-spark kafka-producer-api

I ran into the following exception:


ERROR yarn.ApplicationMaster: User class threw exception: org.apache.spark.SparkException: Task not serializable
org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:304)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2032)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:889)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:888)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
    at org.apache.spark.rdd.RDD.foreach(RDD.scala:888)
    at com.Boot$.test(Boot.scala:60)
    at com.Boot$.main(Boot.scala:36)
    at com.Boot.main(Boot.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:525)
Caused by: java.io.NotSerializableException: org.apache.kafka.clients.producer.KafkaProducer
Serialization stack:
    - object not serializable (class: org.apache.kafka.clients.producer.KafkaProducer, value: org.apache.kafka.clients.producer.KafkaProducer@77624599)
    - field (class: com.Boot$$anonfun$test$1, name: producer$1, type: class org.apache.kafka.clients.producer.KafkaProducer)
    - object (class com.Boot$$anonfun$test$1, <function1>)
    at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
    at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:84)
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:301)

// Imports required by this snippet:
import java.util.HashMap

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig, ProducerRecord}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

//    @transient
val sparkConf = new SparkConf()

sparkConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

//    @transient
val sc = new SparkContext(sparkConf)

val requestSet: RDD[String] = sc.textFile(s"hdfs:/user/bigdata/ADVERTISE-IMPRESSION-STAT*/*")

//    @transient
val props = new HashMap[String, Object]()
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, NearLineConfig.kafka_brokers)
//    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.ByteArraySerializer");
//    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.ByteArraySerializer");
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
props.put("producer.type", "async")
props.put(ProducerConfig.BATCH_SIZE_CONFIG, "49152")

//    @transient
val producer: KafkaProducer[String, String] = new KafkaProducer[String, String](props)

requestSet.foreachPartition((partitions: Iterator[String]) => {
  partitions.foreach((line: String) => {
    try {
      producer.send(new ProducerRecord[String, String]("testtopic", line))
    } catch {
      case ex: Exception => {
        log.warn(ex.getMessage, ex)
      }
    }
  })
})

producer.close()

In this program, I try to read records from an HDFS path and send them to Kafka. The problem is that when I remove the code that sends records to Kafka, it runs well. What am I missing?

2 Answers:

Answer 0 (score: 9)

KafkaProducer is not serializable, so Spark cannot ship the driver-side instance to the executors inside the foreach closure. You need to move the creation of the instance inside foreachPartition, so each partition builds its own producer on the executor:

requestSet.foreachPartition((partitions: Iterator[String]) => {
  val producer: KafkaProducer[String, String] = new KafkaProducer[String, String](props)
  partitions.foreach((line: String) => {
    try {
      producer.send(new ProducerRecord[String, String]("testtopic", line))
    } catch {
      case ex: Exception => {
        log.warn(ex.getMessage, ex)
      }
    }
  })
})

Note that KafkaProducer.send returns a Future[RecordMetadata]; the only exception that can propagate synchronously is a SerializationException thrown when the key or value cannot be serialized, so delivery failures are reported through the returned future rather than being caught by the try/catch above.
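If you want to observe delivery failures per record, one option is to attach a Callback to each send. Below is a minimal sketch of that approach, reusing props, log, and the testtopic name from the question; the callback wiring is illustrative and not part of the original answer:

import org.apache.kafka.clients.producer.{Callback, KafkaProducer, ProducerRecord, RecordMetadata}

requestSet.foreachPartition((partitions: Iterator[String]) => {
  val producer = new KafkaProducer[String, String](props)
  partitions.foreach((line: String) => {
    // The callback runs when the broker acknowledges or rejects the record,
    // so delivery errors get logged even though send() itself does not throw them.
    producer.send(new ProducerRecord[String, String]("testtopic", line), new Callback {
      override def onCompletion(metadata: RecordMetadata, exception: Exception): Unit =
        if (exception != null) log.warn(exception.getMessage, exception)
    })
  })
  // close() flushes any records still buffered by the async producer.
  producer.close()
})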

Answer 1 (score: 0)

I don't recommend Yuval Itzchakov's answer, because it opens and closes a producer (and its broker sockets) for every partition, and opening connections to the Kafka brokers is heavy and slow. I strongly suggest reading this blog post: https://allegro.tech/2015/08/spark-kafka-integration.html. I have used and tested the approach it describes, and it is the best option I have found for a production environment.
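The core idea in that post is to broadcast a small serializable wrapper whose producer field is lazy, so the real KafkaProducer is created once per executor JVM and reused across partitions and batches. A rough sketch of that pattern follows (adapted, not verbatim; the KafkaSink name and shutdown-hook cleanup follow the post's design):

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

// Serializable wrapper: only the factory function travels to the executors;
// the lazy val defers producer creation until first use on each executor.
class KafkaSink(createProducer: () => KafkaProducer[String, String]) extends Serializable {
  lazy val producer = createProducer()
  def send(topic: String, value: String): Unit =
    producer.send(new ProducerRecord[String, String](topic, value))
}

object KafkaSink {
  def apply(config: java.util.Map[String, Object]): KafkaSink = {
    val f = () => {
      val producer = new KafkaProducer[String, String](config)
      // Close (and flush) the producer when the executor JVM exits.
      sys.addShutdownHook { producer.close() }
      producer
    }
    new KafkaSink(f)
  }
}

Usage with the question's props would then look like:

val kafkaSink = sc.broadcast(KafkaSink(props))
requestSet.foreach(line => kafkaSink.value.send("testtopic", line))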