Spark Streaming is not consuming messages

Date: 2018-12-10 19:45:16

Tags: apache-spark hadoop spark-streaming cloudera-cdh

When I try to run the following Spark Streaming (v1.6) job, it does not consume any messages. Only at the moment I kill the job with Ctrl+C does it print the messages from the socket and exit.

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.StorageLevels;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaPairDStream;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

import scala.Tuple2;

// args[0] = batch interval in seconds, args[1] = host, args[2] = port
SparkConf sparkConf = new SparkConf().setAppName("WordCountSocketEx");
JavaStreamingContext streamingContext =
        new JavaStreamingContext(sparkConf, Durations.seconds(Integer.parseInt(args[0])));
JavaReceiverInputDStream<String> StreamingLines = streamingContext.socketTextStream(
        args[1], Integer.parseInt(args[2]), StorageLevels.MEMORY_AND_DISK_SER);

JavaDStream<String> words = StreamingLines.flatMap(new FlatMapFunction<String, String>() {
    @Override
    public Iterable<String> call(String str) throws Exception {
        System.out.println("Msg received is " + str);
        return Arrays.asList(str.split(" "));
    }
});

JavaPairDStream<String, Integer> wordCounts = words.mapToPair(new PairFunction<String, String, Integer>() {
    @Override
    public Tuple2<String, Integer> call(String str) {
        return new Tuple2<>(str, 1);
    }
}).reduceByKey(new Function2<Integer, Integer, Integer>() {
    @Override
    public Integer call(Integer count1, Integer count2) {
        return count1 + count2;
    }
});

wordCounts.print();
streamingContext.start();
streamingContext.awaitTermination();
```
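As an aside on what each batch is supposed to compute: `flatMap` splits every line into words, `mapToPair` pairs each word with 1, and `reduceByKey` sums the counts per word. The same per-batch result can be sketched with plain Java collections (a standalone illustration only, not Spark code; the class name is made up):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Standalone sketch of the per-batch word count that the DStream
// pipeline computes (split -> pair with 1 -> sum by key).
public class WordCountSketch {

    public static Map<String, Integer> countWords(String line) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String word : line.split(" ")) {     // the flatMap step
            counts.merge(word, 1, Integer::sum);  // mapToPair + reduceByKey
        }
        return counts;
    }

    public static void main(String[] args) {
        // Prints {to=2, be=2, or=1, not=1}
        System.out.println(countWords("to be or not to be"));
    }
}
```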

The spark-submit command is:

```
spark-submit --name test --master yarn --deploy-mode client --num-executors 3 --conf spark.dynamicAllocation.enabled=false --class sparkTest.WordCountSocketEx /var/log/sparkTest-0.0.1-SNAPSHOT.jar 60 localhost 9999
```
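Since the job blocks in `awaitTermination()` until data arrives on `localhost:9999`, something must already be listening on that port when the job starts (the question does not show this side; `nc -lk 9999` is the usual choice). A minimal test source can also be sketched in plain Java (a hypothetical helper, not part of the question's setup):

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal line server sketch: accepts one connection and writes test
// lines, mimicking what `nc -lk 9999` would feed to socketTextStream.
public class TestLineServer {

    // Serve the given lines to the first client that connects.
    public static void serveOnce(ServerSocket server, String... lines) throws IOException {
        try (Socket client = server.accept();
             PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
            for (String line : lines) {
                out.println(line);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Default to port 9999, matching the spark-submit arguments above.
        int port = args.length > 0 ? Integer.parseInt(args[0]) : 9999;
        try (ServerSocket server = new ServerSocket(port)) {
            System.out.println("Listening on port " + server.getLocalPort());
            serveOnce(server, "hello streaming world", "hello again");
        }
    }
}
```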

Below is a screenshot of my RM UI *(screenshot not available)*

and a screenshot of the Spark executors *(screenshot not available)*

0 Answers:

There are no answers yet.