Hadoop - MultipleInputs

Date: 2014-10-13 14:03:37

Tags: hadoop mapreduce apache-pig

I am trying to use MultipleInputs in Hadoop, and all of my mappers use FixedLengthInputFormat:

MultipleInputs.addInputPath(job, 
                    new Path(rootDir),       
                    FixedLengthInputFormat.class, 
                    OneToManyMapper.class);

The problem is that each mapper works with a different fixed record width, so a single job-wide setting will not do:

config.setInt(FixedLengthInputFormat.FIXED_RECORD_LENGTH,??);
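For context, FIXED_RECORD_LENGTH tells the input format how to slice the input into equal byte-width records. The slicing itself can be illustrated without Hadoop; this is a minimal plain-Java sketch of the idea (the class and method names are my own, not part of the Hadoop API):

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class FixedLengthDemo {

    // Split a byte buffer into consecutive records of recordLength bytes,
    // mimicking what FixedLengthInputFormat does with file splits.
    static List<String> splitRecords(byte[] data, int recordLength) {
        List<String> records = new ArrayList<>();
        for (int off = 0; off + recordLength <= data.length; off += recordLength) {
            records.add(new String(data, off, recordLength, StandardCharsets.US_ASCII));
        }
        return records;
    }

    public static void main(String[] args) {
        byte[] data = "AAAABBBBCCCC".getBytes(StandardCharsets.US_ASCII);
        // Three 4-byte records: AAAA, BBBB, CCCC
        System.out.println(splitRecords(data, 4));
    }
}
```

Because the width is read from the job Configuration, the last setInt call wins for the whole job, which is exactly why one value per mapper cannot be expressed this way.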

Is there any way to pass a separate FIXED_RECORD_LENGTH to each mapper when using MultipleInputs?

Thanks!

1 Answer:

Answer 0 (score: 1)

Here is the solution: subclass FixedLengthInputFormat once per record width, so each input supplies its own length instead of reading the shared FIXED_RECORD_LENGTH property:

import java.io.IOException;

import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.FixedLengthInputFormat;
import org.apache.hadoop.mapreduce.lib.input.FixedLengthRecordReader;

public class CustomFixedLengthInputFormat extends FixedLengthInputFormat {

    // Example width; define one subclass per input, each with its own value.
    private static final int RECORD_LENGTH = 100;

    @Override
    public RecordReader<LongWritable, BytesWritable> createRecordReader(
            InputSplit split, TaskAttemptContext context) throws IOException,
            InterruptedException {
        // Here the record length is controlled by the subclass instead of
        // being read from the job-wide property via
        // getRecordLength(context.getConfiguration()).
        int recordLength = RECORD_LENGTH;
        if (recordLength <= 0) {
            throw new IOException("Fixed record length " + recordLength
                    + " is invalid. It should be set to a value greater than zero");
        }

        System.out.println("Record Length: " + recordLength);

        return new FixedLengthRecordReader(recordLength);
    }
}
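With one such subclass per width, the driver can then bind each input path to its own format and mapper. A sketch, assuming two hypothetical subclasses Width100InputFormat and Width50InputFormat built as above (the paths are placeholders):

```java
// Driver sketch: each input path gets the InputFormat subclass
// that hard-codes its record width. Not runnable without a cluster;
// Width100InputFormat, Width50InputFormat, and the paths are examples.
Job job = Job.getInstance(new Configuration(), "multi-width-fixed-length");

MultipleInputs.addInputPath(job, new Path("/data/width100"),
        Width100InputFormat.class, OneToManyMapper.class);
MultipleInputs.addInputPath(job, new Path("/data/width50"),
        Width50InputFormat.class, OneToManyMapper.class);
```

Because the length now lives in each InputFormat class rather than in the Configuration, no call to config.setInt(FixedLengthInputFormat.FIXED_RECORD_LENGTH, ...) is needed at all.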