File created in HDFS but nothing is appended to it

Asked: 2015-03-23 00:15:17

Tags: hadoop cloudera flume hortonworks-data-platform flume-ng

I am using an HTTP source to put JSON files into HDFS (single-node sandbox).

The files are created in the correct directory, but nothing is appended to them. Before I start debugging the HTTP source, can you check my flume.conf?

#################################################################
# Name the components on this agent
#################################################################

hdfs-agent.sources = httpsource
hdfs-agent.sinks = hdfssink
hdfs-agent.channels = channel1

#################################################################
# Describe source
#################################################################

# Source node
hdfs-agent.sources.httpsource.type = http
hdfs-agent.sources.httpsource.port = 5140
hdfs-agent.sources.httpsource.handler = org.apache.flume.source.http.JSONHandler

#################################################################
# Describe Sink
#################################################################

# Sink hdfs
hdfs-agent.sinks.hdfssink.type = hdfs
hdfs-agent.sinks.hdfssink.hdfs.path = hdfs://sandbox:8020/user/flume/node
hdfs-agent.sinks.hdfssink.hdfs.fileType = DataStream
hdfs-agent.sinks.hdfssink.hdfs.batchSize = 1
hdfs-agent.sinks.hdfssink.hdfs.rollSize = 0
hdfs-agent.sinks.hdfssink.hdfs.rollCount = 0

#################################################################
# Describe channel
#################################################################

# Channel memory
hdfs-agent.channels.channel1.type = memory
hdfs-agent.channels.channel1.capacity = 1000
hdfs-agent.channels.channel1.transactionCapacity = 100


#################################################################
# Bind the source and sink to the channel
#################################################################

hdfs-agent.sources.httpsource.channels = channel1
hdfs-agent.sinks.hdfssink.channel = channel1

I am currently trying to test it by starting with a small payload:

[{"text": "Hi Flume this Node"}]

So I am thinking: could my batchSize / rollSize / rollCount be the problem?

2 Answers:

Answer 0 (score: 2):

The batchSize, rollSize and rollCount values are fine. Setting rollSize and rollCount to 0 disables rolling files by size and by event count.
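If you do want files to roll, you can rely on the time-based trigger instead; for example (a sketch, 300 seconds is an arbitrary value, the hdfs.rollInterval default is 30):

 hdfs-agent.sinks.hdfssink.hdfs.rollInterval = 300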

hdfs-agent.sources.httpsource.type should be set to org.apache.flume.source.http.HTTPSource.
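In the flume.conf above, that would be:

 hdfs-agent.sources.httpsource.type = org.apache.flume.source.http.HTTPSource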

The data sent to the HTTP source should have the format

[{"headers" : {"a":"b", "c":"d"},"body": "random_body"}, {"headers" : {"e": "f"},"body": "random_body2"}].

I tested sending the data you used ([{"text": "Hi Flume this Node"}]). Nothing was appended to my file, because there was no "body" attribute. But when I posted the following, the data was appended to my file:

 curl -X POST -H 'Content-Type: application/json; charset=UTF-8' -d '[{ "headers" : { "timestamp" : "434324343", "host" : "random_host.example.com", "field1" : "val1" }, "body" : "random_body" }]' http://localhost:5140

Hope this helps.

Answer 1 (score: 1):

As arathim pointed out, org.apache.flume.source.http.JSONHandler expects the Flume event format. If you want to feed in your own plain JSON, you need to write your own handler. Here is an example of a handler that accepts arbitrary JSON:

import java.io.BufferedReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import javax.servlet.http.HttpServletRequest;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.event.EventBuilder;
import org.apache.flume.source.http.HTTPBadRequestException;
import org.apache.flume.source.http.HTTPSourceHandler;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.fasterxml.jackson.core.JsonParseException;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.ObjectMapper;

public class GenericJSONInputHandler implements HTTPSourceHandler {

    private static final Logger LOG = LoggerFactory.getLogger(GenericJSONInputHandler.class);
    protected static final String TIMESTAMP = "timestamp";
    protected static final String TYPE = "type";

    public GenericJSONInputHandler() {
    }

    /**
     * Reads the request body line by line and turns each valid JSON line
     * into a Flume event.
     */
    @Override
    public List<Event> getEvents(HttpServletRequest request) throws Exception {
        BufferedReader reader = request.getReader();
        String charset = request.getCharacterEncoding();
        // UTF-8 is default for JSON. If no charset is specified, UTF-8 is to
        // be assumed.
        if (charset == null) {
            LOG.debug("Charset is null, default charset of UTF-8 should be used.");
        }

        List<Event> eventList = new ArrayList<Event>();
        try {
            String json = reader.readLine();
            while (json != null) {
                LOG.debug("Received line with size " + json.length());
                List<Event> e = createEvents(json);
                if (e != null) {
                    eventList.addAll(e);
                }
                json = reader.readLine();
            }
        }
        catch (Exception ex) {
            throw new HTTPBadRequestException("Request has invalid JSON Syntax.", ex);
        }

        return eventList;
    }

    /**
     * Wraps one JSON line in a single event, adding a timestamp and a
     * default type header. Returns null if the line is not valid JSON.
     */
    protected List<Event> createEvents(String json) {
        try {
            if (isValidJSON(json)) {
                Map<String, String> headers = new HashMap<>();
                headers.put(TIMESTAMP, String.valueOf(System.currentTimeMillis()));
                headers.put(TYPE, "default");
                return Arrays.asList(EventBuilder.withBody(json.getBytes(), headers));
            }
        } catch (Exception e) {
            LOG.error("Could not create event from JSON line", e);
        }
        return null;
    }

    /**
     * Validates a string by running Jackson's parser over every token;
     * a parse error means the string is not well-formed JSON.
     */
    public boolean isValidJSON(final String json) {
        boolean valid = false;
        try {
            final JsonParser parser = new ObjectMapper().getFactory()
                .createParser(json);
            while (parser.nextToken() != null) {
                // consume all tokens; a syntax error throws below
            }
            valid = true;
        }
        catch (JsonParseException jpe) {
            LOG.warn("Received line that is not valid JSON", jpe);
        }
        catch (IOException ioe) {
            LOG.warn("I/O error while validating JSON", ioe);
        }

        return valid;
    }

    @Override
    public void configure(Context context) {
    }

}
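To wire the custom handler in, point the source at it in flume.conf (this assumes the compiled class is on the Flume agent's classpath; use the fully qualified name if you put the class in a package):

 hdfs-agent.sources.httpsource.handler = GenericJSONInputHandler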