Does Spark Streaming now support Kafka 1.1.0?

Time: 2018-05-04 02:22:54

Tags: apache-spark

Spark is now at version 2.3. I have looked at the Maven Central repository: https://search.maven.org/#search%7Cga%7C1%7Cg%3A%22org.apache.spark%22

The jar shown there is spark-streaming-kafka-0-10_2.11.

So is Kafka 1.1.0 not supported yet?

Should I still install Kafka 0.10.x?

2 answers:

Answer 0 (score: 1)

Based on the following link: you should use spark-streaming-kafka-0-10 for Kafka 0.10.0 or higher.
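In other words, the "0-10" in the artifact name marks the minimum broker version the integration targets, not a maximum, so newer brokers such as 1.1.0 remain wire-compatible with it. Declared in a POM, the advice looks like this sketch (the 2.3.0 version is an assumption chosen to match Spark 2.3):

```xml
<!-- Sketch: version 2.3.0 is assumed to match the Spark 2.3 release -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
    <version>2.3.0</version>
</dependency>
```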

Answer 1 (score: 0)

I tested Spark 2.3 against Kafka 1.1.0 using this jar:
<dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
     <version>${spark.version}</version>
</dependency>

It ran fine.

Sample code:

    // Imports (place these at the top of the enclosing class file):
    import java.util.Arrays;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.kafka010.ConsumerStrategies;
    import org.apache.spark.streaming.kafka010.KafkaUtils;
    import org.apache.spark.streaming.kafka010.LocationStrategies;

    // Local mode with a 2-second batch interval
    SparkConf conf = new SparkConf().setAppName("stream test").setMaster("local[*]");
    JavaStreamingContext streamingContext = new JavaStreamingContext(conf, Durations.seconds(2));

    // Kafka consumer configuration
    Map<String, Object> kafkaParams = new HashMap<>();
    kafkaParams.put("bootstrap.servers", "master:9092");
    kafkaParams.put("key.deserializer", StringDeserializer.class);
    kafkaParams.put("value.deserializer", StringDeserializer.class);
    kafkaParams.put("group.id", "use_a_separate_group_id_for_each_stream");
    kafkaParams.put("enable.auto.commit", false);

    List<String> topics = Arrays.asList("A29");

    // Create a direct stream subscribed to the topic
    JavaInputDStream<ConsumerRecord<String, String>> stream = KafkaUtils.createDirectStream(
            streamingContext,
            LocationStrategies.PreferConsistent(),
            ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams)
    );

    // Extract the message values from each record
    JavaDStream<String> lines = stream.map(ConsumerRecord::value);

    // Print up to 30 records per batch
    lines.print(30);

    streamingContext.start();
    streamingContext.awaitTermination(); // declared to throw InterruptedException
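To run the sample outside the IDE, the Kafka integration jar must be on the driver and executor classpath. A hedged spark-submit sketch (the main class, jar path, and 2.3.0 version are all hypothetical placeholders):

```shell
# Sketch: class name, jar path, and version 2.3.0 are assumptions
spark-submit \
  --class com.example.StreamTest \
  --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.0 \
  target/stream-test.jar
```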