How to pass a system property to the map function in Hadoop

Asked: 2013-07-20 07:22:19

Tags: hadoop configuration mapreduce hbase

Is there a way to pass a system parameter (such as -Dmy_param=XXX) through to the map function in the Hadoop MapReduce framework? The job is submitted to the Hadoop cluster via .setJarByClass(). In the mapper I have to create some configuration, and I would like to make it configurable, so the standard approach of a properties file would be fine; I am just struggling to pass the parameter set by the property through to the tasks. An alternative would be to add the properties file to the submitted jar. Does anyone have experience with how to solve this?
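(For illustration of the intent only, a naive sketch, not a recommended solution: the client-side JVM property could be read with System.getProperty in the driver and copied into the job Configuration; the names my_param and MyJob below are placeholders, and the mapper/input/output setup is omitted.)

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class MyJob {
  public static void main(String[] args) throws Exception {
    // Read the JVM system property passed to the client as -Dmy_param=XXX
    // and copy it into the job Configuration, so tasks can read it back
    // via context.getConfiguration().get("my_param").
    Configuration conf = new Configuration();
    String myParam = System.getProperty("my_param");
    if (myParam != null) {
      conf.set("my_param", myParam);
    }
    Job job = new Job(conf, "my job");
    job.setJarByClass(MyJob.class);
    // ... mapper/reducer/input/output setup omitted ...
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}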

1 Answer:

Answer 0 (score: 8)

If you are not already using them in your job, you can try using GenericOptionsParser, Tool, and ToolRunner to run the Hadoop job.

Note: MyDriver extends Configured and implements Tool. To run your job this way (the -D generic options must come before your own arguments), use:

hadoop jar somename.jar MyDriver -D your.property=value arg1 arg2

For more details, check this link.

Here is some sample code I put together for you:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MyDriver extends Configured implements Tool {

  public static class MyDriverMapper extends Mapper<LongWritable, Text, LongWritable, NullWritable> {

    protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
      // In the mapper you can retrieve any configuration you've set
      // while starting the job from the terminal as shown below

      Configuration conf = context.getConfiguration();
      String yourPropertyValue = conf.get("your.property");
    }
  }

  public static class MyDriverReducer extends Reducer<LongWritable, NullWritable, LongWritable, NullWritable> {

    protected void reduce(LongWritable key, Iterable<NullWritable> values, Context context) 
      throws IOException, InterruptedException {
      // --- some code ---
    }
  }

  public static void main(String[] args) throws Exception {
    int exitCode = ToolRunner.run(new MyDriver(), args);
    System.exit(exitCode);
  }

  @Override
  public int run(String[] args) throws Exception {
    Configuration conf = getConf();
    // if you want you can get/set to conf here too.
    // your.property can also be file location and after
    // you retrieve the properties and set them one by one to conf object.

    // --other code--//
    Job job = new Job(conf, "My Sample Job");
    // --- other code ---//
    return (job.waitForCompletion(true) ? 0 : 1);
  }
}
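As a follow-up to the comments in run(): if you prefer the other option mentioned in the question, bundling a properties file inside the submitted jar, one possible sketch is to load the file from the classpath and copy each entry into the Configuration before creating the Job. The file name my-job.properties and the helper class JobProperties below are hypothetical, not part of any Hadoop API.

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

import org.apache.hadoop.conf.Configuration;

// Hypothetical helper: load a properties file packaged inside the job jar
// (the name "my-job.properties" is a placeholder) and copy every entry into
// the Configuration, so mappers/reducers can read the values via
// context.getConfiguration().get(...).
public class JobProperties {

  public static void loadInto(Configuration conf) throws IOException {
    Properties props = new Properties();
    InputStream in = JobProperties.class.getClassLoader()
        .getResourceAsStream("my-job.properties");
    if (in == null) {
      return; // nothing bundled in the jar, keep whatever was set via -D
    }
    try {
      props.load(in);
    } finally {
      in.close();
    }
    for (String name : props.stringPropertyNames()) {
      conf.set(name, props.getProperty(name));
    }
  }
}

You would then call JobProperties.loadInto(conf) in run() before constructing the Job. Note that, written as above, a key from the file overwrites any value of the same name passed with -D; if command-line overrides should win, only set keys that conf does not already contain.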