Exact steps to kill the Hadoop 2.2.0 Configuration deprecation INFO messages

Date: 2014-01-20 02:53:23

Tags: java hadoop hdfs

This question is similar to Hadoop 2.2.0 Configuration deprecation, but the answer to that question did not solve the problem, so here I am asking for the exact steps and giving a concrete example.

Consider the following short map-only program:

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.*;
import org.apache.hadoop.mapreduce.lib.output.*;
import org.apache.hadoop.mapreduce.lib.*;
import org.apache.hadoop.util.*;


public class Foo {

    //     KEYIN,VALUEIN,KEYOUT,VALUEOUT
    public static class Map extends  Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                context.write(word, one);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        org.apache.hadoop.mapreduce.Job job = Job.getInstance();
        job.setJarByClass(Foo.class);
        job.setInputFormatClass(TextInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        job.setMapperClass(Map.class);

        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true)?0:1);
    }
}

When this code is run against the input:

The big brown cat went down the lazy road. 

using the command line below (assuming Input contains the line above):

hadoop jar Foo.jar Foo Input Output 

the following messages are displayed:

14/01/19 18:37:36 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
14/01/19 18:37:37 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
14/01/19 18:37:38 INFO input.FileInputFormat: Total input paths to process : 1
14/01/19 18:37:38 INFO mapreduce.JobSubmitter: number of splits:1
14/01/19 18:37:38 INFO Configuration.deprecation: user.name is deprecated. Instead, use mapreduce.job.user.name
14/01/19 18:37:38 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/01/19 18:37:38 INFO Configuration.deprecation: mapred.mapoutput.value.class is deprecated. Instead, use mapreduce.map.output.value.class
14/01/19 18:37:38 INFO Configuration.deprecation: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
14/01/19 18:37:38 INFO Configuration.deprecation: mapred.job.name is deprecated. Instead, use mapreduce.job.name
14/01/19 18:37:38 INFO Configuration.deprecation: mapreduce.inputformat.class is deprecated. Instead, use mapreduce.job.inputformat.class
14/01/19 18:37:38 INFO Configuration.deprecation: mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
14/01/19 18:37:38 INFO Configuration.deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
14/01/19 18:37:38 INFO Configuration.deprecation: mapreduce.outputformat.class is deprecated. Instead, use mapreduce.job.outputformat.class
14/01/19 18:37:38 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/01/19 18:37:38 INFO Configuration.deprecation: mapred.mapoutput.key.class is deprecated. Instead, use mapreduce.map.output.key.class
14/01/19 18:37:38 INFO Configuration.deprecation: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir

The example above uses only org.apache.hadoop.mapreduce; I used the linked PowerPoint to create it.

What exactly needs to be changed (in this code, or in /etc/hadoop) to make the deprecation messages go away?
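
As an aside on the code side of the question: the WARN in the log about command-line option parsing (a separate issue from the deprecation INFO lines) goes away if the driver implements the Tool interface and is launched through ToolRunner, as the message itself suggests. Below is a minimal sketch of that restructuring; on its own it does not make the deprecation messages disappear.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class Foo extends Configured implements Tool {

    // Same tokenizing mapper as in the question.
    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer tokenizer = new StringTokenizer(value.toString());
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                context.write(word, one);
            }
        }
    }

    @Override
    public int run(String[] args) throws Exception {
        // Build the job from the Configuration that ToolRunner populated.
        Job job = Job.getInstance(getConf());
        job.setJarByClass(Foo.class);
        job.setInputFormatClass(TextInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        job.setMapperClass(Map.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner parses the generic options (-D, -conf, -libjars, ...)
        // before handing the remaining arguments to run(), which is what the
        // "command-line option parsing not performed" WARN is asking for.
        System.exit(ToolRunner.run(new Configuration(), new Foo(), args));
    }
}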

1 Answer:

Answer 0 (score: 0):

This has been fixed in later versions of Apache Hadoop: https://issues.apache.org/jira/browse/HADOOP-10178

You can either upgrade your Apache Hadoop cluster or apply the patch and rebuild it.
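
If upgrading or rebuilding is not an option right away, a common interim workaround is to raise the log level of the deprecation logger so the INFO lines are simply not printed; this hides the messages rather than fixing the underlying key renames. A sketch, assuming the log4j.properties picked up by the hadoop command lives in your Hadoop configuration directory (for example /etc/hadoop):

# Hide the Configuration.deprecation INFO messages. Assumes this is the
# log4j.properties loaded from the Hadoop configuration directory.
log4j.logger.org.apache.hadoop.conf.Configuration.deprecation=WARN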