I am reading JSON data from Kafka and parsing it with Spark, but I end up with a JSON parser problem. The code is shown below:
val Array(zkQuorum, groupId, topics, numThreads) = args
val conf = new SparkConf()
.setAppName("KafkaAggregation")
// create sparkContext
val sc = new SparkContext(conf)
// streaming context
val ssc = new StreamingContext(conf, Seconds(1))
// ssc.checkpoint("hdfs://localhost:8020/usr/tmp/data")
val topicMap = topics.split(",").map((_, numThreads.toInt)).toMap
val lines = KafkaUtils.createStream(ssc, zkQuorum, groupId, topicMap).map((_._2))
val lineJson = lines.map(JSON.parseFull(_))
.map(_.get.asInstanceOf[scala.collection.immutable.Map[String,Any]])
Error details:
error: not found: value JSON
[INFO] val lineJson = lines.map(JSON.parseFull(_))
Could you help me figure out which Maven dependency I should add to resolve this error?
Thanks.
Answer 0 (score: -1)
I think you are looking for this:
import scala.util.parsing.json._
Add the Maven dependency:
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-parser-combinators</artifactId>
    <version>2.11.0-M4</version>
</dependency>
https://maven-repository.com/artifact/org.scala-lang/scala-parser-combinators/2.11.0-M4
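For reference, a minimal sketch of what the parsing step could look like once scala.util.parsing.json resolves. The lines DStream is the one from the question; the filter step is my own addition to guard against records that fail to parse, not part of the original code:

import scala.util.parsing.json.JSON

// With the import in place, the original code compiles as-is.
// Filtering out failed parses avoids a NoSuchElementException on .get.
val lineJson = lines
  .map(JSON.parseFull(_))                        // DStream[Option[Any]]
  .filter(_.isDefined)                           // drop malformed JSON records
  .map(_.get.asInstanceOf[Map[String, Any]])     // DStream[Map[String, Any]]

Calling .get directly on the Option returned by JSON.parseFull throws at runtime on malformed input, so filtering first keeps the streaming job alive when a bad record arrives.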