IntelliJ cannot resolve symbol Subscribe (Kafka)

Date: 2018-05-22 06:56:56

Tags: scala apache-spark intellij-idea

I have the following project in IntelliJ. The problem is that Subscribe inside KafkaUtils.createDirectStream is shown in red and throws Cannot resolve symbol Subscribe, even though I added all the kafka-spark libraries:

    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

    def startMetaInfoSubscriber(ssc: StreamingContext, kafkaParams: Map[String, Object], metaInfoTopic: String) {
      // Set a unique Kafka group identifier to metaInformationStream (each stream requires a unique group ID)
      val metaInformationKafkaParamas = kafkaParams ++ Map[String, Object]("group.id" -> RandomStringUtils.randomAlphabetic(10).toUpperCase)

      KafkaUtils.createDirectStream[String, String](
        ssc,
        PreferConsistent,
        Subscribe[String, String](metaInfoTopic, metaInformationKafkaParamas)
      ).foreachRDD(metaInfoRDD =>
        if (!metaInfoRDD.isEmpty()) {
          println("Saving MetaInformation")
          metaInfoRDD
    //      metaInfoRDD.write.mode("append").format("com.databricks.spark.csv").save(s"hdfs://172.16.8.162:8020/user/sparkload/assetgroup/prueba-kafka")
        } else {
          println("There is not any message for topic 'tu-topic'")
        }
      )
    }

Here is my pom.xml:

    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <scala.version>2.11.8</scala.version>
    <spark.version>2.3.0</spark.version>
    <src.dir>src/main/scala</src.dir>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka_2.11 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>

When I try to compile, I get the following error:

    [ERROR] C:\Users\agomez\Desktop\spark-base\spark-kafka-tfm\src\main\scala\spark_load\EjemploApp.scala:90: error: overloaded method value Subscribe with alternatives:
    [ERROR]   (topics: java.util.Collection[String],kafkaParams: java.util.Map[String,Object])org.apache.spark.streaming.kafka010.ConsumerStrategy[String,String] <and>
    [ERROR]   (topics: java.util.Collection[String],kafkaParams: java.util.Map[String,Object],offsets: java.util.Map[org.apache.kafka.common.TopicPartition,java.lang.Long])org.apache.spark.streaming.kafka010.ConsumerStrategy[String,String] <and>
    [ERROR]   (topics: Iterable[String],kafkaParams: scala.collection.Map[String,Object])org.apache.spark.streaming.kafka010.ConsumerStrategy[String,String] <and>
    [ERROR]   (topics: Iterable[String],kafkaParams: scala.collection.Map[String,Object],offsets: scala.collection.Map[org.apache.kafka.common.TopicPartition,scala.Long])org.apache.spark.streaming.kafka010.ConsumerStrategy[String,String]
    [ERROR]  cannot be applied to (String, scala.collection.immutable.Map[String,Object])
    [ERROR]       Subscribe[String, String](metaInfoTopic, metaInformationKafkaParamas)
    [ERROR]                ^
    [ERROR] one error found

1 Answer:

Answer 0 (score: 1)

I think the first argument to Subscribe() should be a collection of topics.

So you need to pass multiple topics to Subscribe() as a Seq[String] or Array[String]. If you have a single topic, just pass it as:

    Seq(metaInfoTopic)
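
For example, a minimal sketch of the corrected call, reusing the names from the question's code (ssc, metaInfoTopic, metaInformationKafkaParamas):

    // Wrapping the single topic in a Seq makes the call match the
    // (topics: Iterable[String], kafkaParams: scala.collection.Map[String, Object])
    // overload listed in the compiler error
    KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Seq(metaInfoTopic), metaInformationKafkaParamas)
    )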

Hope this helps!