How to read multiple files from multiple directories in Map-Reduce

Asked: 2011-12-28 13:37:09

Tags: java hadoop

I want to read multiple files from multiple directories in a Map-Reduce program. I tried setting the paths in the main method:

FileInputFormat.setInputPaths(conf,new Path("hdfs://localhost:54310/user/test/"));
FileInputFormat.setInputPaths(conf,new Path("hdfs://localhost:54310/Test/test1/"));

But it only reads from one of them.

What should I do to read multiple files?

Please suggest a solution.

Thanks.

2 Answers:

Answer 0 (score: 5)

FileInputFormat#setInputPaths overwrites any previously set input paths. Use FileInputFormat#addInputPath or FileInputFormat#addInputPaths to append to the existing paths instead.
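Applied to the paths from the question, a minimal sketch of the driver setup might look like this (this is not runnable standalone; it assumes Hadoop on the classpath and a `JobConf` already configured with mapper, reducer, and output settings):

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.JobConf;

// ...inside the driver's main method, with `conf` a configured JobConf:
JobConf conf = new JobConf();

// Each addInputPath call APPENDS a path; setInputPaths would have
// replaced the whole list, which is why only one directory was read.
FileInputFormat.addInputPath(conf, new Path("hdfs://localhost:54310/user/test/"));
FileInputFormat.addInputPath(conf, new Path("hdfs://localhost:54310/Test/test1/"));

// addInputPaths also accepts a comma-separated list in one call:
// FileInputFormat.addInputPaths(conf,
//     "hdfs://localhost:54310/user/test/,hdfs://localhost:54310/Test/test1/");
```

Both directories are then scanned for input splits when the job runs.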

Answer 1 (score: 0)

Follow the steps below to pass multiple input files from different directories. Only the driver code changes; see the driver code below.
CODE:
public int run(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "MultipleDirectoryAsInput");
    job.setJarByClass(DriverClass.class);

    // MultipleInputs binds a mapper class to each input path, so the
    // global job.setMapperClass(...) calls are not needed (the original
    // code set the mapper twice, and the second call overrode the first).
    MultipleInputs.addInputPath(job, new Path(args[0]), TextInputFormat.class, Map1Class.class);
    MultipleInputs.addInputPath(job, new Path(args[1]), TextInputFormat.class, Map2Class.class);

    job.setReducerClass(ReducerClass.class);
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(IntWritable.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(NullWritable.class);

    FileOutputFormat.setOutputPath(job, new Path(args[2]));
    return job.waitForCompletion(true) ? 0 : 1;
}