Unable to run a MapReduce job on Hadoop 2.7 - type mismatch

Date: 2016-09-08 14:29:25

Tags: java hadoop hadoop2

When running the program I get Error: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, received org.apache.hadoop.io.LongWritable

I have tried several suggestions from Google / Stack Overflow, but no luck; I still get the same exception. Any idea where I went wrong or what I am missing?

My imports

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path; 
import org.apache.hadoop.io.IntWritable; 
import org.apache.hadoop.io.LongWritable; 
import org.apache.hadoop.io.Text; 
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

My Map class

public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> 
{
    Text k = new Text();


    public void map(Text key, Iterable<IntWritable> value, Context context) 
                throws IOException, InterruptedException {
        String line = value.toString(); 
        StringTokenizer tokenizer = new StringTokenizer(line," "); 
        while (tokenizer.hasMoreTokens()) { 
            String year= tokenizer.nextToken();
            k.set(year);
            String temp= tokenizer.nextToken().trim();
            int v = Integer.parseInt(temp); 
            context.write(k,new IntWritable(v)); 
        }
    }
}

My Reduce class

public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable>
{

    public void reduce (Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int maxtemp=0;
        for(IntWritable it : values) {
            int temperature= it.get();
            if(maxtemp<temperature)
            {
                maxtemp =temperature;
            }
        }
        context.write(key, new IntWritable(maxtemp)); 
    }
}

My main method

Configuration conf = new Configuration();

Job job = new Job(conf, "MaxTemp");
job.setJarByClass(MaxTemp.class);
job.setMapperClass(Mapper.class);
job.setReducerClass(Reducer.class);

job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(IntWritable.class);

job.setInputFormatClass(TextInputFormat.class);
job.setOutputFormatClass(TextOutputFormat.class);

Path outputPath = new Path(args[1]);

FileInputFormat.addInputPath(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));

outputPath.getFileSystem(conf).delete(outputPath);

System.exit(job.waitForCompletion(true) ? 0 : 1);

(I compiled this code with Java 7 in the Eclipse IDE (Mars) and exported it as a runnable JAR; the Hadoop version is 2.7.0.)

2 Answers:

Answer 0 (score: 1)

If you add the @Override annotation to your map function, you will find that it does not actually override the map method in Mapper.

If you look at the Javadoc for Mapper (link here), you will see that the map method should look like this:

map(KEYIN key, VALUEIN value, org.apache.hadoop.mapreduce.Mapper.Context context)

whereas yours looks like this:

map(Text key, Iterable<IntWritable> value, Context context)

so yours should be:

map(LongWritable key, Text value, Context context)

Because you are not actually overriding the base map method of the Mapper class, your method is never called; Hadoop instead calls the default map implementation in Mapper, which looks like this:

protected void map(KEYIN key, VALUEIN value, Context context) throws IOException, InterruptedException {
    context.write((KEYOUT) key, (VALUEOUT) value);
}

This takes the LongWritable key and Text value it receives and writes them straight back out (an identity mapper), which does not match the Text and IntWritable you have told the job they should be.

In your driver, these lines:

job.setMapperClass(Mapper.class);
job.setReducerClass(Reducer.class);

should instead be:

job.setMapperClass(Map.class);
job.setReducerClass(Reduce.class);

You need to use your own implementations, not the base classes.
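
Putting both fixes together, here is a minimal sketch of what the corrected Mapper and driver wiring could look like (it reuses the Map and Reduce class names from the question and assumes the rest of the job setup stays unchanged):

public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {
    private final Text k = new Text();

    @Override // now genuinely overrides Mapper.map, so the compiler checks the signature
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        StringTokenizer tokenizer = new StringTokenizer(value.toString(), " ");
        while (tokenizer.hasMoreTokens()) {
            k.set(tokenizer.nextToken());                            // year
            int v = Integer.parseInt(tokenizer.nextToken().trim());  // temperature
            context.write(k, new IntWritable(v));
        }
    }
}

// In the driver, register the custom classes instead of the Hadoop base classes:
job.setMapperClass(Map.class);
job.setReducerClass(Reduce.class);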

Answer 1 (score: 1)

Your Mapper class definition declares the input key as LongWritable and the input value as Text.

However, your map method declares Text as the key and Iterable<IntWritable> as the value.

Therefore, your map method should be defined as:

public void map(LongWritable key, Text value, Context context)
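
As the first answer notes, the @Override annotation is a cheap way to catch this at compile time rather than at run time; a sketch of what happens if it is added to the mismatched signature (the exact compiler message wording may differ between javac versions):

@Override // javac rejects this: the method does not override or implement a method from a supertype
public void map(Text key, Iterable<IntWritable> value, Context context)
        throws IOException, InterruptedException {
    // With the corrected signature map(LongWritable key, Text value, Context context),
    // the annotation compiles cleanly and Hadoop calls this method instead of the identity mapper.
}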