Apache Kafka producer configuration error

Asked: 2016-01-18 09:37:20

Tags: apache-kafka kafka-producer-api

According to the producer configuration documentation for version 0.9.0.0 of Apache Kafka:

http://kafka.apache.org/documentation.html#producerconfigs

I need to specify the broker list with the following property:

props.put("bootstrap.servers", "localhost:9092")

Here is my producer class:

  import java.util.Properties

  // The producer classes referenced here and in the stack trace below
  // are the old Scala producer API
  import kafka.producer.{KeyedMessage, Producer, ProducerConfig}

  def main(args: Array[String]) {
    //val conf = new SparkConf().setAppName("VPP metrics producer")
    //val sc = new SparkContext(conf)

    val props: Properties = new Properties()
    props.put("bootstrap.servers", "localhost:9092")
    props.put("key.serializer", "kafka.serializer.StringEncoder")
    props.put("value.serializer", "kafka.serializer.StringEncoder")

    val config = new ProducerConfig(props)
    val producer = new Producer[String, String](config)

    1 to 10000 foreach {
      case i =>
        val jsonStr = getRandomTsDataPoint().toJson.toString()
        println(s"sending message $i to kafka")
        producer.send(new KeyedMessage[String, String]("test_topic", jsonStr))
        println(s"sent message $i to kafka")
    }
  }

Here are my dependencies:

object Dependencies {
  val resolutionRepos = Seq(
    "Spray Repository" at "http://repo.spray.cc/"
  )

  object V {
    val spark     = "1.6.0"
    val kafka     = "0.9.0.0"
    val jodaTime  = "2.7"
    val sprayJson = "1.3.2"
    // Add versions for your additional libraries here...
  }

  object Libraries {
    val sparkCore   = "org.apache.spark"           %% "spark-core"            % V.spark 
    val kafka       = "org.apache.kafka"           %% "kafka"                 % V.kafka
    val jodaTime    = "joda-time"                  % "joda-time"              % V.jodaTime
    val sprayJson   = "io.spray"                   %% "spray-json"            % V.sprayJson
  }
}
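
These get wired into build.sbt roughly as follows. This is only a sketch: the project name and Scala version are inferred from the jar path in the error output below, not copied from an actual build file:

  // Hypothetical build.sbt; only the resolver and library references
  // come from the Dependencies object above.
  import Dependencies._

  lazy val root = (project in file("."))
    .settings(
      name := "spark-example-project",  // assumed from the jar name below
      scalaVersion := "2.11.7",         // assumed from target/scala-2.11
      resolvers ++= resolutionRepos,
      libraryDependencies ++= Seq(
        Libraries.sparkCore,
        Libraries.kafka,
        Libraries.jodaTime,
        Libraries.sprayJson
      )
    )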

As you can see, I am using version 0.9.0.0 of Apache Kafka. When I try to run the producer class, I get the following error:

Joes-MacBook-Pro:spark-kafka-producer joe$ java -cp target/scala-2.11/spark-example-project-0.1.0-SNAPAHOT.jar com.eon.vpp.MetricsProducer
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Missing required property 'metadata.broker.list'
    at scala.Predef$.require(Predef.scala:219)
    at kafka.utils.VerifiableProperties.getString(VerifiableProperties.scala:177)
    at kafka.producer.ProducerConfig.<init>(ProducerConfig.scala:66)
    at kafka.producer.ProducerConfig.<init>(ProducerConfig.scala:56)
    at com.eon.vpp.MetricsProducer$.main(MetricsProducer.scala:45)
    at com.eon.vpp.MetricsProducer.main(MetricsProducer.scala)

Why is this happening? I even verified the contents of my jar file, and it is built against version 0.9.0.0 of Apache Kafka! (kafka_2.11-0.9.0.0.jar)

1 Answer:

Answer 0 (score: 1)

Spark 1.6.0 does not support Kafka 0.9 yet. You will have to wait for Spark 2.0.0. See this issue: https://issues.apache.org/jira/browse/SPARK-12177
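
A side note on the stack trace itself: it shows the old Scala producer API (kafka.producer.ProducerConfig) being constructed, and that API reads metadata.broker.list. The bootstrap.servers property is read only by the new Java client (org.apache.kafka.clients.producer.KafkaProducer) that ships alongside it in 0.9. Here is a minimal sketch of that new client, with the topic name borrowed from the question; the object name and payload are made up for illustration:

  import java.util.Properties

  import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

  object NewClientProducer {  // hypothetical name
    def main(args: Array[String]): Unit = {
      val props = new Properties()
      props.put("bootstrap.servers", "localhost:9092")
      // The new client expects these serializers, not kafka.serializer.StringEncoder
      props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
      props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

      val producer = new KafkaProducer[String, String](props)
      try {
        // Placeholder payload; the question builds its JSON with spray-json
        producer.send(new ProducerRecord[String, String]("test_topic", """{"placeholder": true}"""))
      } finally {
        producer.close()
      }
    }
  }

Both APIs are reachable from kafka_2.11-0.9.0.0.jar and its kafka-clients dependency, so the jar version alone does not tell you which producer a given class constructs.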