Transferring data from a local directory into Azure Data Lake Store with Flume

Date: 2018-11-19 17:18:05

Tags: hdfs flume azure-data-lake hdinsight flume-ng

**I am trying to ingest data located in a directory on a Linux VM into Azure Data Lake Store with Flume. I don't know what type of sink to use, or even whether this is possible. For the source I used spooldir.**

**My Flume conf file:**

a1 is my Flume agent

a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = home/
a1.sources.r1.fileHeader = true

# Describe the sink (here is my problem: I don't know what type of sink to use, and I can't find any documentation about it)
a1.sinks.k1.type = 
a1.sinks.k1.endpoint = 


# Use a channel which buffers events in memory

a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
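One possible approach (not part of the original post): on HDInsight, the Hadoop client can address Azure Data Lake Store through the `adl://` filesystem scheme, so Flume's standard HDFS sink can be pointed at an ADLS path, provided the ADLS Hadoop connector jars and the cluster's credential configuration (`core-site.xml`) are on Flume's classpath. A hedged sketch, with the account name and target path as placeholders:

```
# Sketch only: assumes the azure-datalake-store Hadoop connector and its
# credentials (core-site.xml) are visible to the Flume agent.
a1.sinks.k1.type = hdfs
# adl://<account>.azuredatalakestore.net/<path> -- placeholders, not real values
a1.sinks.k1.hdfs.path = adl://youraccount.azuredatalakestore.net/flume/events
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.writeFormat = Text
# Roll files every 5 minutes instead of the default small rolls
a1.sinks.k1.hdfs.rollInterval = 300
a1.sinks.k1.channel = c1
```

Since the HDFS sink just delegates to the Hadoop FileSystem API, any filesystem scheme the underlying Hadoop installation supports (including `adl://` on HDInsight) should work; if Flume runs outside the cluster, the connector jars and credentials have to be supplied manually.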

0 Answers:

There are no answers yet.