Wrong value class: org.apache.mahout.math.VarLongWritable is not class org.apache.mahout.math.VectorWritable

Date: 2016-05-11 09:21:05

Tags: java hadoop mahout

When I was using Mahout with Hadoop to do some recommendation, I ran into a problem.

The error message is:

    wrong value class: org.apache.mahout.math.VarLongWritable is not class org.apache.mahout.math.VectorWritable

The main function sets up the job like this:

    job.setMapperClass(FilesToItemPrefsMapper.class);
    job.setMapOutputKeyClass(VarLongWritable.class);
    job.setMapOutputValueClass(VarLongWritable.class);

    job.setReducerClass(FileToUserVectorReducer.class);
    job.setOutputKeyClass(VarLongWritable.class);
    job.setOutputValueClass(VectorWritable.class);

    job.setOutputFormatClass(SequenceFileOutputFormat.class);
    SequenceFileOutputFormat.setOutputCompressionType(job,CompressionType.NONE);

It also sets the input format:

    job.setInputFormatClass(TextInputFormat.class);

The mapper is:

    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        Matcher m = NUMBERS.matcher(line);
        m.find();
        VarLongWritable userID = new VarLongWritable(Long.parseLong(m.group()));
        VarLongWritable itemID = new VarLongWritable();
        while (m.find()) {
            itemID.set(Long.parseLong(m.group()));
            context.write(userID, itemID);
        }
    }
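The `NUMBERS` pattern is never shown in the question. A minimal standalone sketch of the mapper's parsing logic, assuming `NUMBERS` matches runs of digits (the `PrefLineParser` class and `parse` helper below are hypothetical, introduced only for illustration):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PrefLineParser {
    // Assumption: the question's NUMBERS pattern matches runs of digits.
    private static final Pattern NUMBERS = Pattern.compile("\\d+");

    // Mirrors the mapper's structure: the first number on the line is the
    // user ID, every following number is an item ID.
    static long[] parse(String line) {
        Matcher m = NUMBERS.matcher(line);
        List<Long> ids = new ArrayList<>();
        while (m.find()) {
            ids.add(Long.parseLong(m.group()));
        }
        long[] out = new long[ids.size()];
        for (int i = 0; i < out.length; i++) {
            out[i] = ids.get(i);
        }
        return out;
    }

    public static void main(String[] args) {
        long[] ids = parse("123,101 102 103");
        System.out.println(ids[0]);          // user ID: 123
        System.out.println(ids.length - 1);  // three item preferences follow
    }
}
```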

The reducer is:

    public class FileToUserVectorReducer 
            extends Reducer<VarLongWritable, VarLongWritable, VarLongWritable, VectorWritable> {
        public void reducer(VarLongWritable userID, Iterable<VarLongWritable> itemPrefs, Context context)
                throws IOException, InterruptedException {
            Vector userVector = new RandomAccessSparseVector(Integer.MAX_VALUE, 100);
            for (VarLongWritable itemPref : itemPrefs) {
                userVector.set((int) itemPref.get(), 1.0f);
            }
            context.write(userID, new VectorWritable(userVector));
        }
    }


I think the reducer's output value is a VectorWritable, which is set by job.setOutputValueClass(VectorWritable.class). If so, why does it throw this error?

1 answer:

Answer 0 (score: 0)

The problem is in the reducer: the method is named reducer(...), but it must be named reduce(...) to override Reducer.reduce(). Because the name does not match, Hadoop falls back to the default identity reduce(), which writes the VarLongWritable map output values straight through; those values do not match the declared VectorWritable output class, hence the error. The fix:

    public class FileToUserVectorReducer 
            extends Reducer<VarLongWritable, VarLongWritable, VarLongWritable, VectorWritable> {
        @Override
        public void reduce(VarLongWritable userID, Iterable<VarLongWritable> itemPrefs, Context context)
                throws IOException, InterruptedException {
            Vector userVector = new RandomAccessSparseVector(Integer.MAX_VALUE, 100);
            for (VarLongWritable itemPref : itemPrefs) {
                userVector.set((int) itemPref.get(), 1.0f);
            }
            context.write(userID, new VectorWritable(userVector));
        }
    }

@Override is very helpful here: with @Override, the compiler reports the mistake at compile time. I thought it was unnecessary at first, but this experience proved its value.