java.lang.RuntimeException: java.lang.NoSuchMethodException: org.apache.hadoop.io.TwoDArrayWritable.<init>()

Asked: 2014-12-26 18:04:08

Tags: java hadoop multidimensional-array

I have a map/reduce job that uses TwoDArrayWritable to emit a two-dimensional array. When I try to emit the 2D array, it throws an exception related to instantiation. I searched and found that Hadoop requires a default (no-argument) constructor, which TwoDArrayWritable does not provide. How can I provide a default constructor for TwoDArrayWritable, or is there something else I am doing wrong? Please help me.
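For context, Hadoop's WritableSerialization instantiates the value class reflectively on the reduce side before filling it from the stream, roughly like this (a simplified sketch, not the exact Hadoop source):

// Simplified sketch of what WritableSerialization's deserializer does:
// the value class is constructed via reflection, which requires a
// no-argument constructor, and only then populated from the byte stream.
Writable writable = (Writable) ReflectionUtils.newInstance(writableClass, conf);
writable.readFields(dataIn);

This is why a Writable value type without a no-arg constructor fails at deserialization time rather than in the mapper.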

Here is the mapper code:

public class JaccardMapper extends Mapper<LongWritable, Text, IntTextPair, TwoDArrayWritable> {

    Hashtable<String, String> movieInfo = new Hashtable<String, String>();
    String[] genres, actors, entities;
    String[] attributes = new String[] {"genre", "actors", "directors", "country", "year", "ratings"};
    double p, q, r, s;
    double result = 0.0;
    String[] input = null;
    Set<String> keys;

    TwoDArrayWritables array2d = new TwoDArrayWritables();
    //TwoDArrayWritable array2d = new TwoDArrayWritable(IntWritable.class);
    IntWritable[][] jaccard = new IntWritable[2][];
    //int[][] jaccard = new int[2][];


    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException
    {
        p = 0;
        q = 0;
        r = 0;
        s = 0;

        /*
         * dataset format
         * 0 -> movieid
         * 1 -> title
         * 2 -> year
         * 3 -> actors
         * 4 -> directors
         * 5 -> genre
         * 6 -> country
         * 7 -> ratings
         * 8 -> cost
         * 9 -> revenue
         * */

        /*
         * input format
         * 0 -> genre
         * 1 -> actors
         * 2 -> directors
         * 3 -> country
         * 4 -> year
         * 5 -> ratings
         * */

        /*
         * (q + r) / (p + q + r) 
         * p -> number of variables positive for both objects 
         * q -> number of variables positive for the ith objects and negative for jth objects
         * r -> number of variables negative for the ith objects and positive for jth objects
         * s -> number of variables negative for both objects
         * */


        input = value.toString().toLowerCase().split(",");
        keys = movieInfo.keySet();


        //the jaccard 2d array's column length depends on the user input: the best case is 6, but the worst case depends on the sub-attribute count (more than one actor/director/genre/country).
        int columnlength = input[1].split("\\|").length + input[2].split("\\|").length + input[3].split("\\|").length + input[4].split("\\|").length + 2;
        jaccard = new IntWritable[2][columnlength];
        for (int i = 0; i < jaccard.length; i++)
        {
            for (int j = 0; j < jaccard[i].length; j++)
            {
                jaccard[i][j] = new IntWritable(0);
            }
        }

        if (input.length > 0)
        {
            //iterate through the dataset in cache
            for(String keyy : keys)
            {
                //iterate to user's input attributes
                for (int attribute = 1; attribute < attributes.length; attribute++)
                {
                    if (!input[attribute].equals("-")) 
                    {
                        entities = input[attribute].toLowerCase().split("\\|");
                        int subattributecount = 0;

                        for(String entity : entities)
                        {
                                if (movieInfo.get(keyy).toString().toLowerCase().contains(entity))
                                {
                                    //if user criteria match with the data set, mark 1, 1
                                    jaccard[0][attribute + subattributecount] = new IntWritable(1);
                                    jaccard[1][attribute + subattributecount] = new IntWritable(1);
                                }
                                else
                                {
                                    //if user criteria doesn't match with the data set, mark 1, 0
                                    jaccard[0][attribute + subattributecount] = new IntWritable(1);
                                    jaccard[1][attribute + subattributecount] = new IntWritable(0);
                                }
                                subattributecount += 1;
                        }
                    }
                }
                IntTextPair pair = new IntTextPair(Integer.parseInt(input[0]), movieInfo.get(keyy).toString());

                array2d.set(jaccard);
                context.write(pair, array2d);
            }


        }

    }
}

Here is the TwoDArrayWritables wrapper class:

import org.apache.hadoop.io.TwoDArrayWritable;

public class TwoDArrayWritables extends TwoDArrayWritable
{
    public TwoDArrayWritables() {
        super(TwoDArrayWritable.class);
    }

    public TwoDArrayWritables(Class valueClass) {
        super(valueClass);
    }
}
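For comparison, the usual pattern is for the no-arg constructor to pass the element class of the array (IntWritable here, since the mapper builds an IntWritable[][]), not TwoDArrayWritable itself. A minimal sketch, with an illustrative class name:

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.TwoDArrayWritable;

// Minimal sketch: register IntWritable as the element type of the 2D array,
// so that readFields() can reconstruct each cell reflectively.
public class IntTwoDArrayWritable extends TwoDArrayWritable {
    public IntTwoDArrayWritable() {
        super(IntWritable.class);
    }
}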

Here is the exception:

14/12/26 16:15:32 INFO mapreduce.Job: Task Id : attempt_1419259182533_0112_r_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.NoSuchMethodException: org.apache.hadoop.io.TwoDArrayWritable.<init>()
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
        at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:66)
        at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
        at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKeyValue(ReduceContextImpl.java:146)
        at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKey(ReduceContextImpl.java:121)
        at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.nextKey(WrappedReducer.java:307)
        at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:170)
        at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:627)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.NoSuchMethodException: org.apache.hadoop.io.TwoDArrayWritable.<init>()
        at java.lang.Class.getConstructor0(Class.java:2892)
        at java.lang.Class.getDeclaredConstructor(Class.java:2058)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:125)
        ... 13 more
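Note that the trace shows ReflectionUtils trying to construct org.apache.hadoop.io.TwoDArrayWritable itself, which typically means the job configuration registers the base class, rather than the TwoDArrayWritables subclass, as the map output value class. A minimal sketch of the relevant driver lines, assuming a Job object named job:

// Sketch: the registered value class must be the subclass that actually has
// a no-arg constructor; registering the base TwoDArrayWritable reproduces
// the NoSuchMethodException above.
job.setMapOutputKeyClass(IntTextPair.class);
job.setMapOutputValueClass(TwoDArrayWritables.class);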

0 Answers:

No answers yet.