SBT unable to import Kafka encoder/decoder classes

Asked: 2016-04-04 08:22:46

Tags: apache-spark sbt apache-kafka spark-streaming kafka-consumer-api

Project setup:

  • 1 producer - serializes objects & sends the bytes to Kafka
  • 1 Spark consumer - should consume the bytes using DefaultDecoder from the kafka.serializer package (see the sketch after this list)
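
For context, a minimal sketch of such a consumer against the receiver-based API in Spark 1.6.1 (the topic name, ZooKeeper address, and group id below are made up; the actual project code is not part of the question):

    import org.apache.spark.SparkConf
    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    // _root_ makes the import absolute -- see answer 0 below.
    import _root_.kafka.serializer.DefaultDecoder

    object ByteConsumer {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("RssReaderDemo").setMaster("local[2]")
        val ssc  = new StreamingContext(conf, Seconds(5))

        // DefaultDecoder is Kafka's pass-through Decoder[Array[Byte]]:
        // it hands the raw message bytes to Spark unmodified.
        val stream = KafkaUtils.createStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder](
          ssc,
          Map("zookeeper.connect" -> "localhost:2181", "group.id" -> "rss-consumer"),
          Map("rss-topic" -> 1),
          StorageLevel.MEMORY_AND_DISK_SER
        )

        stream.map { case (_, bytes) => bytes.length }.print()

        ssc.start()
        ssc.awaitTermination()
      }
    }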

The problem:

  • SBT pulls in the correct libraries (kafka-clients + kafka_2.10), but no class inside the kafka_2.10 jar can be found.
  • It seems to search on the wrong path (org.apache.spark.streaming.kafka instead of org.apache.kafka).

Error message:

    [error] object serializer is not a member of package org.apache.spark.streaming.kafka
    [error] import kafka.serializer.DefaultDecoder

SBT dependency tree:

    [info]   +-org.apache.spark:spark-streaming-kafka_2.10:1.6.1
    [info]   | +-org.apache.kafka:kafka_2.10:0.8.2.1 [S]  <-- **DefaultDecoder is in here but SBT can't find it (kafka.serializer.DefaultDecoder)**
    [info]   | | +-org.apache.kafka:kafka-clients:0.8.2.1
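
As a quick sanity check (assuming an `sbt console` session; this check is not part of the original post), the class can be loaded by its fully qualified name, which bypasses Scala's relative-import resolution entirely:

    // Class.forName takes the absolute class name, so it cannot be
    // hijacked by the org.apache.spark.streaming.kafka package.
    Class.forName("kafka.serializer.DefaultDecoder")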

build.sbt:

lazy val commonSettings = Seq(
  organization := "org.RssReaderDemo",
  version := "0.1.0",
  scalaVersion := "2.10.6"
)

resolvers += "Artima Maven Repository" at "http://repo.artima.com/releases"

val spark = "org.apache.spark" % "spark-core_2.10" % "1.6.1"
val sparkStreaming = "org.apache.spark" % "spark-streaming_2.10" % "1.6.1"
val sparkStreamKafka = "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1"

// Needed to be able to parse the generated avro JSON schema
val jacksonMapperAsl = "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"

val scalactic = "org.scalactic" %% "scalactic" % "2.2.6"
val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" % "test"

val avro = "org.apache.avro" % "avro" % "1.8.0"

lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    libraryDependencies += spark,
    libraryDependencies += sparkStreaming,
    libraryDependencies += sparkStreamKafka,
    libraryDependencies += jacksonMapperAsl,
    libraryDependencies += scalactic,
    libraryDependencies += scalatest,
    libraryDependencies += avro
  )

2 answers:

Answer 0 (score: 19):

This has nothing to do with SBT. You probably have something like this:

    import org.apache.spark.streaming._
    import kafka.serializer.DefaultDecoder

Because the package org.apache.spark.streaming.kafka exists, the second import resolves to org.apache.spark.streaming.kafka.serializer.DefaultDecoder. You can import the correct class like this: `import _root_.kafka.serializer.DefaultDecoder`. For more details on how Scala resolves imports, see https://wiki.scala-lang.org/display/SYGN/Language+FAQs#LanguageFAQs-HowdoIimport
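
A minimal sketch of the collision and the fix:

    import org.apache.spark.streaming._   // brings the nested package `kafka` into scope

    // Relative import: `kafka` now resolves to org.apache.spark.streaming.kafka,
    // which has no `serializer` member, hence the compile error:
    // import kafka.serializer.DefaultDecoder

    // Absolute import: _root_ anchors resolution at the root package,
    // so this finds the real kafka.serializer.DefaultDecoder:
    import _root_.kafka.serializer.DefaultDecoder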

Answer 1 (score: 0):

You need "import kafka.serializer.StringDecoder" before "import org.apache.spark.streaming._". The import order fixes the problem.

Works -

    import kafka.serializer.StringDecoder
    import org.apache.spark.streaming._

Exception -

    import org.apache.spark.streaming._
    import kafka.serializer.StringDecoder

(The kafka import has to come first: the wildcard import brings the org.apache.spark.streaming.kafka package into scope, and that shadows the root kafka package for any relative import that follows.)