How to use the Scala class JavaDStreamKafkaWriter in a Java application?

Asked: 2015-11-20 19:24:24

Tags: java eclipse scala maven

I am using a Maven project in Eclipse, and it does not compile.

    JavaRDDKafkaWriter<String> writer = JavaRDDKafkaWriterFactory.fromJavaRDD(inrdd);
    writer.writeToKafka(producerConf, new ProcessingFunc());

The class JavaDStreamKafkaWriter cannot be resolved to a type, even though I have included the Maven dependency.

    <dependency>
        <groupId>org.cloudera.spark.streaming.kafka</groupId>
        <artifactId>spark-kafka-writer</artifactId>
        <version>0.1.1-SNAPSHOT</version>
    </dependency>

For details, see https://github.com/cloudera/spark-kafka-writer

1 Answer:

Answer 0 (score: 1)

First, download the source code from https://github.com/cloudera/spark-kafka-writer and build the snapshot locally; I did the same thing. This installs the jar into your local Maven repository.
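A minimal sketch of that build step, assuming Git and Maven are installed (`mvn install` publishes the snapshot jar into `~/.m2/repository`, where your project's dependency resolution can find it):

```shell
# Clone the writer's source and install the snapshot into the local Maven repository
git clone https://github.com/cloudera/spark-kafka-writer.git
cd spark-kafka-writer
mvn clean install
```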

For a DStream, the code would be:

    // Producer properties expected by the old (0.8.x) Kafka producer API
    final Properties properties = new Properties();
    properties.put("metadata.broker.list", "localhost:9092");
    properties.put("serializer.class", "kafka.serializer.StringEncoder");

    // Wrap the stream in a writer; `stream` is your JavaDStream<String>
    final JavaDStreamKafkaWriter<String> writer =
            JavaDStreamKafkaWriterFactory.fromJavaDStream(stream);

    // Map each record to a KeyedMessage for the target topic
    writer.writeToKafka(properties, msg -> new KeyedMessage<>(TOPIC, msg));