Problem creating multiple Avro Flume receivers in Spark Streaming

Asked: 2017-10-18 20:16:47

Tags: apache-spark spark-streaming flume hadoop-streaming flume-ng

I need to connect multiple Flume sinks to Spark Streaming. Here is my Flume configuration:

agent1.sinks.sink1a.type = avro
agent1.sinks.sink1a.hostname = localhost
agent1.sinks.sink1a.port = 9091

agent1.sinks.sink1b.type = avro
agent1.sinks.sink1b.hostname = localhost
agent1.sinks.sink1b.port = 9092

But only port 9091 connects; port 9092 does not.
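
For reference, the snippet above only shows the sink definitions; a complete agent configuration also needs a source, channels, and sink-to-channel bindings. A minimal sketch of what the rest might look like (the source and channel names, and the netcat source, are assumptions for illustration, not taken from the original post):

agent1.sources = src1
agent1.channels = ch1a ch1b
agent1.sinks = sink1a sink1b

# Assumed netcat source; the default replicating selector copies each event to both channels
agent1.sources.src1.type = netcat
agent1.sources.src1.bind = localhost
agent1.sources.src1.port = 44444
agent1.sources.src1.channels = ch1a ch1b

agent1.channels.ch1a.type = memory
agent1.channels.ch1b.type = memory

# Each Avro sink drains its own channel and pushes to one Spark receiver port
agent1.sinks.sink1a.channel = ch1a
agent1.sinks.sink1b.channel = ch1b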

Here is my Spark code for creating multiple Flume streams:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

val sparkConf = new SparkConf().setAppName("WordCount")
val ssc = new StreamingContext(sparkConf, Seconds(20))

// One push-based Avro receiver per Flume sink port
val rawLines = FlumeUtils.createStream(ssc, "localhost", 9091)
val rawLines1 = FlumeUtils.createStream(ssc, "localhost", 9092)

// Decode each Flume event body into a string
val lines = rawLines.map(record => new String(record.event.getBody.array()))
val lines1 = rawLines1.map(record1 => new String(record1.event.getBody.array()))

// Merge the two streams and split into words
val lines_combined = lines.union(lines1)
val words = lines_combined.flatMap(_.split(" "))
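
For context, here is a self-contained sketch of the same pipeline, assuming Spark 2.x with the spark-streaming-flume connector on the classpath and a master that offers enough cores for both receivers plus processing (each createStream call occupies one core). The local[*] master, the port list, and the word-count output step are illustrative additions:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

object MultiFlumeWordCount {
  def main(args: Array[String]): Unit = {
    // Needs at least 3 cores: one per receiver plus one for processing
    val sparkConf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val ssc = new StreamingContext(sparkConf, Seconds(20))

    // One push-based Avro receiver per port; these must match the Flume sink ports
    val ports = Seq(9091, 9092)
    val streams = ports.map(port => FlumeUtils.createStream(ssc, "localhost", port))

    // Union all receiver streams, decode event bodies, and split into words
    val lines = ssc.union(streams).map(e => new String(e.event.getBody.array()))
    val words = lines.flatMap(_.split(" "))

    words.map(w => (w, 1)).reduceByKey(_ + _).print()

    ssc.start()
    ssc.awaitTermination()
  }
}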

What am I doing wrong? Any help on this would be great.

0 Answers:

No answers yet.