Right now I have to use a Flume sink to get data out of Kafka. What I want to achieve is to consume the data from Kafka every 30 minutes and write it to files so it can be used for online learning. The configuration is as follows:
# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1
# Describe/configure the source
a1.sources.r1.type = org.apache.flume.source.kafka.KafkaSource
a1.sources.r1.batchSize = 5000
a1.sources.r1.batchDurationMillis = 2000
a1.sources.r1.kafka.zookeeperConnect = host
a1.sources.r1.kafka.topics = topic
# Use a channel which buffers events on disk (file channel)
a1.channels.c1.type = file
# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
a1.sinks.k1.type = file_roll
a1.sinks.k1.sink.directory = /home/flume_log
a1.sinks.k1.sink.rollInterval = 30
Is there something wrong with this configuration? When I run test.conf, even though I produce some data into the topic, none of it shows up in the output files: all of the files are empty.
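For reference, below is a sketch of what I think the agent might need to look like for a 30-minute roll, based on my reading of the Flume documentation. I am assuming Flume 1.7+ here, where the Kafka source is pointed at the brokers via kafka.bootstrap.servers instead of ZooKeeper; the broker address, consumer group, and channel directories are placeholders for my environment. Note that file_roll's sink.rollInterval is in seconds, so 30 minutes would be 1800. Is something like this closer to correct?

a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Kafka source (Flume 1.7+ style: brokers instead of ZooKeeper)
a1.sources.r1.type = org.apache.flume.source.kafka.KafkaSource
a1.sources.r1.kafka.bootstrap.servers = broker-host:9092
a1.sources.r1.kafka.topics = topic
a1.sources.r1.kafka.consumer.group.id = flume_consumer
a1.sources.r1.batchSize = 5000
a1.sources.r1.batchDurationMillis = 2000

# File channel with explicit checkpoint/data directories
a1.channels.c1.type = file
a1.channels.c1.checkpointDir = /home/flume_channel/checkpoint
a1.channels.c1.dataDirs = /home/flume_channel/data

# file_roll sink; rollInterval is in seconds, so 1800 = 30 minutes
a1.sinks.k1.type = file_roll
a1.sinks.k1.sink.directory = /home/flume_log
a1.sinks.k1.sink.rollInterval = 1800

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1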