Can a Hadoop mapper produce multiple keys in its output?

Asked: 2011-05-25 16:40:13

Tags: hadoop key mapper

Can a single Mapper class produce multiple key-value pairs (of the same type) in one run?

We emit key-value pairs in the mapper like this:

context.write(key, value);

Here is a simplified version of the key class (example):

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.ObjectWritable;
import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.io.WritableComparator;


public class MyKey extends ObjectWritable implements WritableComparable<MyKey> {

    public enum KeyType {
        KeyType1,
        KeyType2
    }

    private KeyType keyTupe;
    private Long field1;
    private Integer field2 = -1;
    private String field3 = "";


    public KeyType getKeyType() {
        return keyTupe;
    }

    public void settKeyType(KeyType keyType) {
        this.keyTupe = keyType;
    }

    public Long getField1() {
        return field1;
    }

    public void setField1(Long field1) {
        this.field1 = field1;
    }

    public Integer getField2() {
        return field2;
    }

    public void setField2(Integer field2) {
        this.field2 = field2;
    }


    public String getField3() {
        return field3;
    }

    public void setField3(String field3) {
        this.field3 = field3;
    }

    @Override
    public void readFields(DataInput datainput) throws IOException {
        keyTupe = KeyType.valueOf(datainput.readUTF());
        field1 = datainput.readLong();
        field2 = datainput.readInt();
        field3 = datainput.readUTF();
    }

    @Override
    public void write(DataOutput dataoutput) throws IOException {
        dataoutput.writeUTF(keyTupe.toString());
        dataoutput.writeLong(field1);
        dataoutput.writeInt(field2);
        dataoutput.writeUTF(field3);
    }

    @Override
    public int compareTo(MyKey other) {
        if (getKeyType().compareTo(other.getKeyType()) != 0) {
            return getKeyType().compareTo(other.getKeyType());
        } else if (getField1().compareTo(other.getField1()) != 0) {
            return getField1().compareTo(other.getField1());
        } else if (getField2().compareTo(other.getField2()) != 0) {
            return getField2().compareTo(other.getField2());
        } else if (getField3().compareTo(other.getField3()) != 0) {
            return getField3().compareTo(other.getField3());
        } else {
            return 0;
        }
    }

    public static class MyKeyComparator extends WritableComparator {
        public MyKeyComparator() {
            super(MyKey.class);
        }

        public int compare(byte[] b1, int s1, int l1, byte[] b2, int s2, int l2) {
            return compareBytes(b1, s1, l1, b2, s2, l2);
        }
    }

    static { // register this comparator
        WritableComparator.define(MyKey.class, new MyKeyComparator());
    }
}

And this is how we try to emit two keys from the Mapper:

MyKey key1 = new MyKey();
key1.settKeyType(KeyType.KeyType1);
key1.setField1(1L);
key1.setField2(23);

MyKey key2 = new MyKey();
key2.settKeyType(KeyType.KeyType2);
key2.setField1(1L);
key2.setField3("abc");

context.write(key1, value1);
context.write(key2, value2);

Our job's output format class is org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.

I mention this because in other output format classes I have seen that the output is not appended, but simply committed, in their implementation of the write method.

Also, these are the Mapper and Context classes we are using: org.apache.hadoop.mapreduce.Mapper and org.apache.hadoop.mapreduce.Context.
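For completeness, below is a minimal driver sketch of how such a job might be wired up. MyMapper and the Text value type are assumptions made for illustration; they are not part of the original question:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;

public class MyJobDriver {
    public static void main(String[] args) throws Exception {
        // Standard "new API" (org.apache.hadoop.mapreduce) job setup.
        Job job = new Job(new Configuration(), "multi-key mapper example");
        job.setJarByClass(MyJobDriver.class);

        // MyMapper is a hypothetical Mapper that issues the context.write calls shown above.
        job.setMapperClass(MyMapper.class);

        // MyKey from the question is the output key type; Text is an assumed value type.
        job.setOutputKeyClass(MyKey.class);
        job.setOutputValueClass(Text.class);
        job.setOutputFormatClass(SequenceFileOutputFormat.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}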

1 Answer:

Answer 0 (score: 10):

Writing to the context multiple times within one map task is perfectly fine.

However, there may be a few problems with your key class. Whenever you implement WritableComparable for a key, you should also implement the equals(Object) and hashCode() methods. These are not part of the WritableComparable interface, since they are defined in Object, but you still have to provide implementations.

The default partitioner uses the hashCode() method to determine which reducer each key/value pair goes to. If you don't provide a sensible implementation, you can get strange results.
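For reference, the default partitioner (org.apache.hadoop.mapreduce.lib.partition.HashPartitioner) does roughly the following, so whichever hashCode() your key class inherits or defines directly determines how keys are spread across reducers:

// Roughly what the default HashPartitioner does: the target reducer is derived
// entirely from key.hashCode(), masked with Integer.MAX_VALUE to stay non-negative.
public int getPartition(K key, V value, int numReduceTasks) {
    return (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks;
}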

As a rule of thumb, whenever you implement hashCode() or any kind of comparison method, you should also provide an equals(Object) method. Make sure it accepts Object as its parameter, because that is how it is defined in the Object class (whose implementation you will probably be overriding).
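As an illustration, here is a minimal sketch of what equals(Object) and hashCode() could look like for the MyKey class above. The 31-based hash combination and the omission of null checks are assumptions of this sketch, not requirements:

// Sketch only: assumes all four fields are non-null, as in the question's examples.
@Override
public boolean equals(Object obj) {
    if (this == obj) {
        return true;
    }
    if (!(obj instanceof MyKey)) {
        return false;
    }
    MyKey other = (MyKey) obj;
    return keyTupe == other.keyTupe
            && field1.equals(other.field1)
            && field2.equals(other.field2)
            && field3.equals(other.field3);
}

@Override
public int hashCode() {
    // Combine the same fields that compareTo() uses, so that equal keys hash equally.
    int result = keyTupe.hashCode();
    result = 31 * result + field1.hashCode();
    result = 31 * result + field2.hashCode();
    result = 31 * result + field3.hashCode();
    return result;
}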