Hadoop: cannot find symbol

Asked: 2014-11-29 13:57:27

Tags: hadoop bigdata

I have just started learning Hadoop and am following "Hadoop: The Definitive Guide".

I first tested the older way of writing the Map and Reduce classes, where Mapper and Reducer are interfaces. That code worked fine. I then started writing code where Mapper and Reducer are abstract classes that use a Context class. By the way, I am using Hadoop 1.2.1. I see the following errors:

MaxTemperatureReducer.java:5: error: cannot find symbol
public class MaxTemperatureReducer extends Reducer<Text, IntWritable, Text, IntWritable> 
                                       ^
  symbol: class Reducer
MaxTemperatureReducer.java:7: error: cannot find symbol
    public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException,InterruptedException 
                                                           ^
  symbol:   class Context
  location: class MaxTemperatureReducer
MaxTemperature.java:5: error: cannot find symbol
import org.apache.hadoop.mapreduce.FileInputFormat;
                              ^
  symbol:   class FileInputFormat
  location: package org.apache.hadoop.mapreduce
MaxTemperature.java:6: error: cannot find symbol
import org.apache.hadoop.mapreduce.FileOutputFormat;
                              ^
  symbol:   class FileOutputFormat
  location: package org.apache.hadoop.mapreduce
MaxTemperature.java:7: error: cannot find symbol
import org.apache.hadoop.mapreduce.JobClient;

My Mapper class looks like this:

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MaxTemperatureMapper extends Mapper<LongWritable, Text, Text, IntWritable>
{
    private static final int MISSING = 9999;

    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException
    {
        String line = value.toString();
        String year = line.substring(0, 4);
        int airTemperature;
        if (line.charAt(4) == '+')
        {
            airTemperature = Integer.parseInt(line.substring(5, 10));
        }
        else
        {
            System.out.println(line);
            airTemperature = Integer.parseInt(line.substring(4, 9));
        }
        System.out.println("Mapper: " + year + ", " + airTemperature);
        context.write(new Text(year), new IntWritable(airTemperature));
    }
}
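As an aside, the parsing logic in the mapper is independent of Hadoop and can be checked with a plain JDK. A minimal stand-alone sketch follows; the class name `TemperatureParser` and the sample record `"1950+00221"` (4-digit year followed by a signed 5-digit temperature) are made up here for illustration:

```java
// Stand-alone sketch of the mapper's parsing logic; no Hadoop classes needed.
// The record layout is an assumption: a 4-digit year, then a signed
// 5-digit temperature, e.g. "1950+00221" or "1949-00112".
public class TemperatureParser {

    static String year(String line) {
        return line.substring(0, 4);
    }

    static int temperature(String line) {
        if (line.charAt(4) == '+') {
            // Skip the '+' sign: Integer.parseInt rejects a leading '+'
            // on Java 6 and earlier (it is accepted only since Java 7).
            return Integer.parseInt(line.substring(5, 10));
        }
        // A '-' sign is parsed as part of the number.
        return Integer.parseInt(line.substring(4, 9));
    }

    public static void main(String[] args) {
        String line = "1950+00221";
        System.out.println(year(line) + ", " + temperature(line)); // → 1950, 221
    }
}
```

Verifying the logic this way, outside the cluster, makes the compile errors above easier to isolate: they come from the imports, not from the map logic.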

Can anyone help?

2 Answers:

Answer 0 (score: 0)

It looks like the problem is in the import statements of your driver class (MaxTemperature): you are using the wrong imports for FileInputFormat and FileOutputFormat. Use the following imports for FileInputFormat and FileOutputFormat in your driver:

import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

Use this example as a reference to make sure all of your import statements are valid.
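With the corrected imports, a driver along the lines of the book's MaxTemperature example would look roughly like the sketch below. This is a hedged reconstruction, not your actual code: the mapper/reducer class names come from the question, the rest is the usual `Job` setup for the new (`org.apache.hadoop.mapreduce`) API on Hadoop 1.x, and it will not compile or run without the Hadoop jars on the classpath:

```java
// Sketch of a driver using the corrected imports (new mapreduce API, Hadoop 1.x).
// Not runnable without a Hadoop installation; shown only to place the imports in context.
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;   // correct location
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat; // correct location

public class MaxTemperature {
    public static void main(String[] args) throws Exception {
        Job job = new Job();
        job.setJarByClass(MaxTemperature.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.setMapperClass(MaxTemperatureMapper.class);
        job.setReducerClass(MaxTemperatureReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Note also that `JobClient` (imported in the errors above) belongs to the old `org.apache.hadoop.mapred` API; with the new API you use `Job` instead.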

Answer 1 (score: 0)

Your import statements are incorrect. These classes are available either in the org.apache.hadoop.mapred package (old API) or in org.apache.hadoop.mapreduce.lib.input (new API).

FileOutputFormat has the same problem: it lives in org.apache.hadoop.mapred or in org.apache.hadoop.mapreduce.lib.output.
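The Reducer errors in the question have a related cause: a missing `import org.apache.hadoop.mapreduce.Reducer;`. `Context` is a nested class of `Reducer`, so the `cannot find symbol: class Context` error disappears once `Reducer` itself resolves. A hedged sketch of the reducer, following the shape in the book (the max-finding body is the standard example, not necessarily your code, and it will not compile without the Hadoop jars):

```java
// Sketch of the reducer with the import that the compiler errors suggest is missing.
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer; // without this: "cannot find symbol: class Reducer"

public class MaxTemperatureReducer
        extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // Context resolves here because it is Reducer.Context, a nested class.
        int maxValue = Integer.MIN_VALUE;
        for (IntWritable value : values) {
            maxValue = Math.max(maxValue, value.get());
        }
        context.write(key, new IntWritable(maxValue));
    }
}
```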