Kafka Spark producer receiver communication

Time: 2017-01-24 18:24:09

Tags: scala apache-spark apache-kafka

I am trying to create a basic Kafka producer and receiver to understand how things work. The code below does not raise any errors, but my receiver does not seem to receive anything. Also, the producer appears to send very slowly.

Can anyone tell me why the receiver prints nothing, and why the producer sends so slowly?

Here is my first attempt at a producer:

import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object ProducerTest {

  // Copy the option map into a java.util.Properties for the producer constructor.
  def extractOptions(properties: Map[String, Any]): Properties = {
    val props = new Properties()
    properties.foreach { case (key, value) => props.put(key, value.toString) }
    props
  }


  val mandatoryOptions: Map[String, Any] = Map(
    "bootstrap.servers" -> "127.0.0.1:9092",
    "acks" -> "all",
    "batch.size" -> 16384,
    "linger.ms" -> 1,
    "buffer.memory" -> 33554432,
    "key.serializer" -> "org.apache.kafka.common.serialization.StringSerializer",
    "value.serializer" -> "org.apache.kafka.common.serialization.StringSerializer")

  val producer = new KafkaProducer[String, String](extractOptions(mandatoryOptions))

  val TOPIC="test"

  // Recursively send the same record forever. send() is asynchronous, and with
  // a null callback the returned Future is discarded, so any broker-side error
  // is never surfaced.
  def urlRecall(): Unit = {
    println("Hallo")

    val record = new ProducerRecord[String, String](TOPIC, "key", "hallo")
    producer.send(record, null)

    urlRecall()
  }

  def main(args: Array[String]): Unit = {
    urlRecall()
  }
}
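
For reference, here is a minimal, bounded sketch of a producer that surfaces the result of every send. The record count, the logging callback, and the final flush()/close() are assumptions added for illustration; they are not part of the program above:

import java.util.Properties
import org.apache.kafka.clients.producer.{Callback, KafkaProducer, ProducerRecord, RecordMetadata}

object ProducerCallbackSketch {

  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "127.0.0.1:9092")
    props.put("acks", "all")
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

    val producer = new KafkaProducer[String, String](props)

    // Send a fixed number of records; the callback reports success or failure
    // of each send, so broker-side problems are no longer invisible.
    (1 to 100).foreach { i =>
      val record = new ProducerRecord[String, String]("test", "key", s"hallo $i")
      producer.send(record, new Callback {
        override def onCompletion(metadata: RecordMetadata, exception: Exception): Unit =
          if (exception != null) exception.printStackTrace()
          else println(s"sent to partition ${metadata.partition} offset ${metadata.offset}")
      })
    }

    // flush() blocks until all buffered records have been acknowledged;
    // close() then releases the producer's network resources.
    producer.flush()
    producer.close()
  }
}

If nothing is logged at all, the records never left the client's buffer; if offsets are logged, the problem is on the receiving side.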

The code below shows the receiver, which I run in a separate window. It does not give any errors, but it does not print anything either:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark._
import org.apache.spark.streaming._

object kafkaConsumerTest extends App {

  val kafkaParams = Map[String, Object](
    "bootstrap.servers" -> "127.0.0.1:9092",
    "key.deserializer" -> classOf[StringDeserializer],
    "value.deserializer" -> classOf[StringDeserializer],
    "group.id" -> "1",
    "auto.offset.reset" -> "latest",
    "enable.auto.commit" -> (false: java.lang.Boolean)
  )

  val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
  val ssc = new StreamingContext(conf, Seconds(1))

  val topics = Array("test")
  val stream = KafkaUtils.createDirectStream[String, String](
    ssc,
    PreferConsistent,
    Subscribe[String, String](topics, kafkaParams)
  )

  // print() outputs the first ten elements of each batch to the driver's stdout.
  stream.print()

  ssc.start()
  ssc.awaitTermination()

}
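
For comparison, here is a sketch of the same stream printed as plain key/value pairs instead of raw ConsumerRecord objects; only these two lines change, and the rest of the program stays as above:

  // ConsumerRecord is not serializable, so it is usually mapped to plain
  // values before any Spark output or shuffle operation.
  val pairs = stream.map(record => (record.key, record.value))
  pairs.print()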

0 Answers