Spark Streaming + Kafka sbt assembly

Date: 2016-08-23 17:03:16

Tags: apache-spark apache-kafka sbt spark-streaming spark-streaming-kafka

I have a Spark Streaming + Kafka example. It works in the IDE, but when I try to compile it from the console via sbt (e.g. with sbt compile), I get errors.

Main class:

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}
  import org.apache.spark.streaming.kafka.KafkaUtils

  val conf = new SparkConf().setMaster("local[*]").setAppName("KafkaReceiver")
  val ssc = new StreamingContext(conf, Seconds(5))

  // Receiver-based stream: ZooKeeper quorum, consumer group, topic -> receiver thread count
  val kafkaStream1 = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("t1" -> 5))
  //val kafkaStream2 = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("topic2" -> 5))

  //kafkaStream.fla
  kafkaStream1.print()
  ssc.start()
  ssc.awaitTermination()
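The commented-out line suggests a flatMap transformation was planned on the stream. As a sketch (the tokenize helper below is hypothetical, not from the question), the same per-message logic can be exercised on plain Scala collections without Spark, which helps separate logic bugs from the sbt classpath problem:

```scala
// Hypothetical tokenizer for Kafka message values; in the streaming job the
// equivalent would be: kafkaStream1.flatMap { case (_, msg) => tokenize(msg) }
object Tokenize {
  // Split a message into non-empty whitespace-separated tokens
  def tokenize(message: String): Seq[String] =
    message.split("\\s+").filter(_.nonEmpty).toSeq

  def main(args: Array[String]): Unit = {
    // Simulated (key, value) records as createStream would deliver them
    val records = Seq(("t1", "hello spark streaming"), ("t1", "kafka"))
    val words = records.flatMap { case (_, msg) => tokenize(msg) }
    println(words.mkString(","))
  }
}
```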

Error message:

[error] bad symbolic reference. A signature in package.class refers to type compileTimeOnly
[error] in package scala.annotation which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling package.class.
[error] Reference to method any2ArrowAssoc in object Predef should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error]   val kafkaStream1 = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("t1" -> 5))
[error]                                                                                                           ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed

build.sbt:

    name := "test"
    val sparkVersion = "2.0.0"

    lazy val commonSettings = Seq(
      organization := "com.test",
      version := "1.0",
      scalaVersion := "2.11.8",
      test in assembly := {}
    )

    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-streaming_2.11" % sparkVersion,
      "org.apache.spark" % "spark-streaming-kafka-0-8_2.11" % sparkVersion
    )
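One thing stands out in the build file above: commonSettings, which is what sets scalaVersion := "2.11.8", is defined but never applied to any project, so sbt may fall back to its default Scala version and compile against mismatched binaries. A sketch of how the settings could be wired in (the root project name is an assumption):

```scala
// Apply commonSettings to the root project so scalaVersion actually takes effect
lazy val root = (project in file("."))
  .settings(commonSettings: _*)
  .settings(
    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-streaming_2.11" % sparkVersion,
      "org.apache.spark" % "spark-streaming-kafka-0-8_2.11" % sparkVersion
    )
  )
```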

Do you have any idea how to fix it?

1 answer:

Answer 0 (score: 0)

Can you share your build.sbt? One possible cause of the "bad symbolic reference" error is a Scala version mismatch. Please see this topic for more details on the problem. Also, make sure the Scala version you are using matches the one Spark expects; see this blog post for more details.
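As the answer suggests, "bad symbolic reference" together with the any2ArrowAssoc error typically means artifacts compiled for one Scala binary version (e.g. 2.10) ended up on a classpath for another (2.11). One way to avoid hard-coding the _2.11 suffix is the %% operator, which appends the project's own scalaVersion automatically (a sketch, using the same dependencies as the question):

```scala
// %% resolves to spark-streaming_2.11 etc. based on scalaVersion,
// so the suffix can never drift out of sync with the compiler
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-kafka-0-8" % sparkVersion
)
```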