Spark Streaming not working

Asked: 2015-06-19 22:55:45

Tags: apache-spark spark-streaming

I have a basic Spark Streaming word count and it just doesn't work.

import sys
from pyspark import SparkConf, SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(appName='streaming', master="local[*]")
scc = StreamingContext(sc, batchDuration=5)

lines = scc.socketTextStream("localhost", 9998)
words = lines.flatMap(lambda line: line.split())
counts = words.map(lambda word: (word, 1)).reduceByKey(lambda x, y: x + y)

counts.pprint()

print 'Listening'
scc.start()
scc.awaitTermination()  

I am running nc -lk 9998 in another terminal and pasted some text into it. It prints the typical logs (no exceptions), but it ends up queuing the job for some odd time (45 yrs?) and keeps printing this...

15/06/19 18:53:30 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:874
15/06/19 18:53:30 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 2 (PythonRDD[7] at RDD at PythonRDD.scala:43)
15/06/19 18:53:30 INFO TaskSchedulerImpl: Adding task set 2.0 with 1 tasks
15/06/19 18:53:35 INFO JobScheduler: Added jobs for time 1434754415000 ms
15/06/19 18:53:40 INFO JobScheduler: Added jobs for time 1434754420000 ms
15/06/19 18:53:45 INFO JobScheduler: Added jobs for time 1434754425000 ms
...
...
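As a side note on the "odd time": the large number in the "Added jobs for time ... ms" lines is a Unix epoch timestamp in milliseconds, i.e. roughly 45 years after 1970, not a 45-year scheduling delay. A quick check in plain Python (no Spark needed):

```python
import datetime

# The batch time Spark Streaming logs is milliseconds since the Unix epoch.
batch_time_ms = 1434754415000  # from the "Added jobs for time ..." log line
batch_time = datetime.datetime.fromtimestamp(
    batch_time_ms / 1000, tz=datetime.timezone.utc
)
print(batch_time)  # 2015-06-19 22:53:35+00:00
```

That matches the 18:53:35 wall-clock time in the log (UTC-4).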

What am I doing wrong?

1 answer:

Answer 0 (score: 1)

Spark Streaming needs multiple executor threads to work. Try using local[4] as the master. In local mode the socket receiver permanently occupies one thread, so with only one thread available there is nothing left to process the queued batches.
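A minimal sketch of the suggested fix (the helper name `pick_master` is hypothetical; the key point is that the master URL must grant Spark Streaming at least two local threads):

```python
import multiprocessing

# Hypothetical helper: choose a local master URL with enough threads for
# Spark Streaming -- one for the socket receiver plus at least one more
# for processing the batches.
def pick_master(min_threads=2):
    n_cores = multiprocessing.cpu_count()
    return "local[%d]" % max(n_cores, min_threads)

# Used in place of master="local[*]" in the question's code:
# sc = SparkContext(appName='streaming', master=pick_master())
```

On a single-core machine, "local[*]" resolves to one thread, which reproduces exactly the symptom above: batches are queued forever but never processed.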