Why is the Ignite cache empty after saving?

Asked: 2019-05-13 10:44:04

Tags: spark-streaming rdd ignite

My data pipeline is as follows: Kafka => perform some computation => load the resulting pairs into an Ignite cache => print them out

 SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("MainApplication");
 JavaSparkContext sc = new JavaSparkContext(conf);
 JavaStreamingContext streamingContext = new JavaStreamingContext(sc, Durations.seconds(10));
 JavaIgniteContext<String, Float> igniteContext = new JavaIgniteContext<>(sc, PATH, false);

 JavaDStream<Message> dStream = KafkaUtils.createDirectStream(
         streamingContext,
         LocationStrategies.PreferConsistent(),
         ConsumerStrategies.<String, Message>
                 Subscribe(Collections.singletonList(TOPIC), kafkaParams)
 )
         .map(ConsumerRecord::value);

 JavaPairDStream<String, Message> pairDStream =
         dStream.mapToPair(message -> new Tuple2<>(message.getName(), message));

 JavaPairDStream<String, Float> pairs = pairDStream
         .combineByKey(new CreateCombiner(), new MergeValue(), new MergeCombiners(), new HashPartitioner(10))
         .mapToPair(new ToPairTransformer());

 JavaIgniteRDD<String, Float> myCache = igniteContext.fromCache(new CacheConfiguration<>());

 // I know that we put something here:
 pairDStream.foreachRDD((VoidFunction<JavaPairRDD<String, Float>>) myCache::savePairs);

 // But I can't see anything here:
 myCache.foreach(tuple2 -> System.out.println("In cache: " + tuple2._1() + " = " + tuple2._2()));

 streamingContext.start();
 streamingContext.awaitTermination();
 streamingContext.stop();
 sc.stop();

But this code prints nothing. Why?

Why is the Ignite cache still empty after savePairs?

What is wrong here?

Thanks!

1 Answer:

Answer 0: (score: 2)

To me, pairDStream.foreachRDD(...) looks like a lazy operation: it only registers work to be done and has no effect until the streaming context is started with streamingContext.start().

myCache.foreach(...), on the other hand, is an eager operation, and you are executing it before start(), on a cache that at that moment is genuinely empty.

So try placing myCache.foreach(...) after the streaming context has started, or even after it terminates. Alternatively, do the printing inside a foreachRDD callback, so it runs per batch once the stream is live.
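The lazy-vs-eager distinction can be illustrated with plain JDK streams, no Spark required (this is an analogy I am adding, not code from the question): an intermediate operation like peek only *declares* work, much as foreachRDD only registers a per-batch action, and nothing happens until a terminal operation runs, much as nothing happens until streamingContext.start().

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Stream;

public class LazyDemo {
    // Stands in for the Ignite cache in this analogy.
    static List<String> cache = new ArrayList<>();

    public static void main(String[] args) {
        // Like pairDStream.foreachRDD(...): this only declares the work;
        // nothing is written to "cache" yet because streams are lazy.
        Stream<String> pipeline = Stream.of("a", "b").peek(cache::add);

        // Like myCache.foreach(...) before streamingContext.start():
        // reading the cache now finds it empty.
        System.out.println("before terminal op: " + cache);  // prints "before terminal op: []"

        // Like streamingContext.start(): a terminal operation triggers the work.
        pipeline.forEach(x -> {});

        System.out.println("after terminal op: " + cache);   // prints "after terminal op: [a, b]"
    }
}
```

The same principle explains the question: the print over myCache runs while the DStream pipeline is still only declared, so it observes an empty cache.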