I can't understand what the cleanup method in Hadoop actually is or how it works. I have the following MapReduce code to compute the maximum, minimum, and mean of a bunch of numbers.
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class Statistics
{
    public static class Map extends Mapper<LongWritable, Text, Text, Text>
    {
        // Running aggregates, accumulated across all map() calls in this task.
        private double min = Double.MAX_VALUE;
        private double max = -Double.MAX_VALUE;
        private double linear_sum = 0;
        private double quadratic_sum = 0;
        private long count = 0;

        @Override
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException
        {
            /* code to update min, max, linear_sum, quadratic_sum, and count from among a bunch of numbers */
        }

        // Runs once per map task, after the last map() call; emits this mapper's partial results.
        @Override
        public void cleanup(Context context) throws IOException, InterruptedException
        {
            Text key_min = new Text();
            key_min.set("min");
            Text value_min = new Text();
            value_min.set(String.valueOf(min));
            context.write(key_min, value_min);

            Text key_max = new Text();
            key_max.set("max");
            Text value_max = new Text();
            value_max.set(String.valueOf(max));
            context.write(key_max, value_max);

            Text key_avg = new Text();
            key_avg.set("avg");
            Text value_avg = new Text();
            value_avg.set(String.valueOf(linear_sum) + "," + count);
            context.write(key_avg, value_avg);

            Text key_stddev = new Text();
            key_stddev.set("stddev");
            Text value_stddev = new Text();
            value_stddev.set(String.valueOf(linear_sum) + "," + count + "," + String.valueOf(quadratic_sum));
            context.write(key_stddev, value_stddev);
        }
    }

    public static class Reduce extends Reducer<Text, Text, Text, Text>
    {
        @Override
        public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException
        {
            /* code to combine the per-mapper min, max, and mean partials into global results */
        }
    }

    public static void main(String[] args) throws Exception
    {
        /* driver program */
    }
}
So what exactly is the cleanup(Context context) method doing? I assumed it collects the output (key, value) pairs from a bunch of mappers and passes them to the reducer. On other sites I have read that the order of execution in MapReduce is: setup -> map -> cleanup, and then setup -> reduce -> cleanup. Why doesn't this program use the setup method?
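For reference, setup() is the mirror image of cleanup(): it runs once per task before the first map() call, typically to read job configuration or initialize state, and can simply be omitted when there is nothing to initialize. A minimal sketch of what an override would look like (the stats.threshold key is hypothetical, purely for illustration):

public static class Map extends Mapper<LongWritable, Text, Text, Text>
{
    private double threshold;

    // Runs once per map task, before the first map() call.
    @Override
    protected void setup(Context context) throws IOException, InterruptedException
    {
        threshold = context.getConfiguration().getFloat("stats.threshold", 0.0f);
    }

    // map(), cleanup(), etc. as above
}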
Answer 0 (score: 1):
These values should not be computed in the Mapper; they should be computed in the Reduce step. See https://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduceTutorial.html#Reducer
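For instance, if every mapper simply emitted each raw number under one shared key (say, "stats"), the reducer would see all the values together and could do the whole aggregation itself. A rough sketch under that assumption:

public static class Reduce extends Reducer<Text, Text, Text, Text>
{
    @Override
    public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException
    {
        double min = Double.MAX_VALUE;
        double max = -Double.MAX_VALUE;
        double sum = 0;
        long count = 0;

        // All numbers for this key arrive in a single call, so the
        // aggregation can live here instead of in Mapper.cleanup().
        for (Text v : values) {
            double x = Double.parseDouble(v.toString());
            min = Math.min(min, x);
            max = Math.max(max, x);
            sum += x;
            count++;
        }

        context.write(new Text("min"), new Text(String.valueOf(min)));
        context.write(new Text("max"), new Text(String.valueOf(max)));
        context.write(new Text("avg"), new Text(String.valueOf(sum / count)));
    }
}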