Using MultipleOutputs without context.write results in empty files

Date: 2017-04-07 07:38:35

Tags: java hadoop mapreduce

I can't figure out how to use the MultipleOutputs class. I'm using it to create multiple output files. Below is a snippet of my Driver class:

    Configuration conf = new Configuration();

    Job job = Job.getInstance(conf);
    job.setJarByClass(CustomKeyValueTest.class);//class with mapper and reducer
    job.setOutputKeyClass(CustomKey.class);
    job.setOutputValueClass(Text.class);
    job.setMapOutputKeyClass(CustomKey.class);
    job.setMapOutputValueClass(CustomValue.class);
    job.setMapperClass(CustomKeyValueTestMapper.class);
    job.setReducerClass(CustomKeyValueTestReducer.class);
    job.setInputFormatClass(TextInputFormat.class);

    Path in = new Path(args[1]);
    Path out = new Path(args[2]);
    out.getFileSystem(conf).delete(out, true);

    FileInputFormat.setInputPaths(job, in);
    FileOutputFormat.setOutputPath(job, out);

    MultipleOutputs.addNamedOutput(job, "islnd" , TextOutputFormat.class, CustomKey.class, Text.class);
    LazyOutputFormat.setOutputFormatClass(job, TextOutputFormat.class);
    MultipleOutputs.setCountersEnabled(job, true);

    boolean status = job.waitForCompletion(true);

In the Reducer, I use MultipleOutputs like this:

    private MultipleOutputs<CustomKey, Text> multipleOutputs;

    @Override
    public void setup(Context context) throws IOException, InterruptedException {
        multipleOutputs = new MultipleOutputs<>(context);
    }

    @Override
    public void reduce(CustomKey key, Iterable<CustomValue> values, Context context) throws IOException, InterruptedException {
        ...
        multipleOutputs.write("islnd", key, pop, key.toString());
        //context.write(key, pop);
    }

    public void cleanup() throws IOException, InterruptedException {
        multipleOutputs.close();
    }

    }

When I use context.write, I get output files containing data. But when I remove context.write, the output files are empty. I don't want to call context.write, though, because it creates the extra file part-r-00000. As stated here (last paragraph of the class description), I used LazyOutputFormat to avoid the part-r-00000 file. But it still doesn't work.

1 Answer:

Answer 0 (score: 0)

    LazyOutputFormat.setOutputFormatClass(job, TextOutputFormat.class);

What this means is: if no output is produced, don't create an empty file.

Can you please look at the Hadoop counters and check:

 1. map.output.records
 2. reduce.input.groups
 3. reduce.input.records

to verify whether your mappers are sending any data to the reducers.
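These counters can also be read programmatically in the driver once `job.waitForCompletion(true)` has returned; a sketch using the built-in `TaskCounter` enum (the logging line is illustrative):

```java
import org.apache.hadoop.mapreduce.Counters;
import org.apache.hadoop.mapreduce.TaskCounter;

// After job.waitForCompletion(true) returns, inspect the built-in task
// counters to see whether the mappers emitted anything to the reducers.
Counters counters = job.getCounters();
long mapOut    = counters.findCounter(TaskCounter.MAP_OUTPUT_RECORDS).getValue();
long redGroups = counters.findCounter(TaskCounter.REDUCE_INPUT_GROUPS).getValue();
long redIn     = counters.findCounter(TaskCounter.REDUCE_INPUT_RECORDS).getValue();
System.out.printf("map out=%d, reduce groups=%d, reduce in=%d%n",
        mapOut, redGroups, redIn);
```

If `MAP_OUTPUT_RECORDS` is nonzero but nothing reaches the output files, the problem is on the reduce side rather than in the mappers.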

Example code for MultipleOutputs: http://bytepadding.com/big-data/map-reduce/multipleoutputs-in-map-reduce/
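For reference, a minimal sketch of the usual MultipleOutputs reducer pattern, using the question's own key/value classes and the named output "islnd" (the value aggregation is elided). Note that the framework only invokes the `cleanup(Context)` override, and `close()` is what flushes the MultipleOutputs record writers:

```java
import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

public class CustomKeyValueTestReducer
        extends Reducer<CustomKey, CustomValue, CustomKey, Text> {

    private MultipleOutputs<CustomKey, Text> multipleOutputs;

    @Override
    protected void setup(Context context) {
        multipleOutputs = new MultipleOutputs<>(context);
    }

    @Override
    protected void reduce(CustomKey key, Iterable<CustomValue> values, Context context)
            throws IOException, InterruptedException {
        Text pop = new Text(/* aggregate the values here */);
        // Write to the named output registered in the driver; the fourth
        // argument becomes the file-name prefix instead of "part".
        multipleOutputs.write("islnd", key, pop, key.toString());
    }

    @Override
    protected void cleanup(Context context) throws IOException, InterruptedException {
        // close() flushes the underlying record writers; if it never runs,
        // buffered output can be lost and the files come out empty.
        multipleOutputs.close();
    }
}
```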