I'm trying to read records from a Kafka topic using Spark Streaming.
Here is my code:
import java.util.UUID

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

object KafkaConsumer {
  import ApplicationContext._

  def main(args: Array[String]): Unit = {
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> s"${UUID.randomUUID().toString}",
      "auto.offset.reset" -> "earliest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val topics = Array("pressure")
    val stream = KafkaUtils.createDirectStream[String, String](
      streamingContext,
      PreferConsistent,
      Subscribe[String, String](topics, kafkaParams)
    )

    stream.print()
    stream.map(record => (record.key, record.value)).count().print()

    streamingContext.start()
  }
}
When I run this, nothing is displayed. To check whether the pressure topic actually contains data, I used the command-line consumer, which does show records:
bin/kafka-console-consumer.sh \
--bootstrap-server localhost:9092 \
--topic pressure \
--from-beginning
Output:
TimeStamp:07/13/16 15:20:45:226769,{'Pressure':'834'}
TimeStamp:07/13/16 15:20:45:266287,{'Pressure':'855'}
TimeStamp:07/13/16 15:20:45:305694,{'Pressure':'837'}
What's wrong?
Answer 0 (score: 7)
You're missing streamingContext.awaitTermination().
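For reference, the end of main with that call added would look like this (a minimal sketch; awaitTermination() blocks the driver thread so the streaming job keeps running long enough to process batches and print anything):

    streamingContext.start()
    // Block until the streaming computation stops or fails. Without this,
    // main() returns immediately and no batch output is ever printed.
    streamingContext.awaitTermination()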
Answer 1 (score: -2)
You need to start the streamingContext and, at the end, call streamingContext.awaitTermination().