Error when computing the average temperature in HBase

Time: 2013-10-29 15:34:06

Tags: java hadoop mapreduce hbase

I want to compute the average temperature from the HBase table test (info:date, info:temp) and write the result into the table result (info:date, info:avg), but the program throws an error when I run it.

The code is:

public static class mapper1 extends TableMapper<Text, FloatWritable>
    {
        public static final byte[] Info = "info".getBytes();
        public static final byte[] Date = "date".getBytes();
        public static final byte[] Temp = "temp".getBytes();
        private static Text key = new Text();

        public void map(ImmutableBytesWritable row, Result value, Context context)
                throws IOException
        {
            String k1 = new String(value.getValue(Info, Date));
            key.set(k1);
            byte[] val = value.getValue(Info, Temp);
            try
            {
                context.write(key, new FloatWritable(Float.parseFloat(Bytes.toString(val))));
            }
            catch (InterruptedException e)
            {
                throw new IOException(e);
            }
        }
    }

// **********************************************************************

    public static class reducer1 extends TableReducer<Text, Result, Text>
    {
        public static final byte[] info = "info".getBytes();
        public static final byte[] date = "date".getBytes();
        byte[] avg;

        public void reduce(Text key, Iterable<FloatWritable> values, Context context)
                throws IOException, InterruptedException
        {
            float sum = 0;
            int count = 0;
            float average = 0;
            for (FloatWritable val : values)
            {
                sum += val.get();
                count++;
            }
            average = sum / count;

            Put put = new Put(Bytes.toBytes(key.toString()));
            put.add(info, date, Bytes.toBytes(average));

            System.out.println("For\t" + count + "\t average is:" + average);

            context.write(key, put);
        }
    }

// **********************************************************************

    public static void main(String args[])
            throws IOException, ClassNotFoundException, InterruptedException, NullPointerException
    {
        Configuration config = HBaseConfiguration.create();
        config.set("hbase.zookeeper.quorum", "localhost");
        HTable table1 = new HTable(config, "test");
        HTable table2 = new HTable(config, "result");

        Job job = new Job(config, "AVG");
        Scan scan = new Scan();
        scan.addFamily("info".getBytes());
        scan.setFilter(new FirstKeyOnlyFilter());

        TableMapReduceUtil.initTableMapperJob(
                "test",
                scan,
                mapper1.class,
                Text.class,
                FloatWritable.class,
                job);
        TableMapReduceUtil.initTableReducerJob(
                "result",
                reducer1.class,
                job);
        job.setNumReduceTasks(1);

        boolean b = job.waitForCompletion(true);
        if (!b) {
            throw new IOException("error with job!");
        }
    }
}
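
For reference, the listing above omits its import block; with the pre-1.0 HBase API used here (HTable, Put.add, new Job(...)), the imports it needs would be roughly:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
    import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
    import org.apache.hadoop.hbase.mapreduce.TableMapper;
    import org.apache.hadoop.hbase.mapreduce.TableReducer;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.hadoop.io.FloatWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;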

The error message is:

    Exception in thread "main" java.lang.NullPointerException
        at org.apache.hadoop.net.DNS.reverseDns(DNS.java:92)
        at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.reverseDNS(TableInputFormatBase.java:223)
        at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:189)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:452)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:469)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:366)
        at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1218)
        at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1215)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1367)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1215)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1236)
        at TempVar.AVG.main(AVG.java:126)
Can you help me?

1 Answer:

Answer 0 (score: 1)

It looks like HBase cannot resolve the machine name returned by ZooKeeper. Either configure DNS correctly, or, if you are not using DNS, add a mapping from the host name to its IP address in the /etc/hosts file.
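
For example, if the machine were reachable at 192.168.1.10 under the name myhost (both the name and the address here are placeholders, substitute the ones your cluster actually uses), the /etc/hosts entry could look like this:

    127.0.0.1      localhost
    192.168.1.10   myhost.example.com   myhost

To check from Java that forward and reverse lookup both work on the node submitting the job, a quick diagnostic along these lines can help (a standalone sketch, not part of the MapReduce job itself):

    import java.net.InetAddress;

    public class DnsCheck {
        public static void main(String[] args) throws Exception {
            // Forward lookup: the local machine's name and the address it maps to.
            InetAddress local = InetAddress.getLocalHost();
            System.out.println("host name : " + local.getHostName());
            System.out.println("address   : " + local.getHostAddress());

            // Reverse lookup: this should print a real host name, not just the IP;
            // if it only prints the IP, the reverse-DNS step in TableInputFormatBase
            // is likely to fail in the same way.
            InetAddress byAddr = InetAddress.getByName(local.getHostAddress());
            System.out.println("reverse   : " + byAddr.getCanonicalHostName());
        }
    }

If the reverse lookup only prints the bare IP address, fix the DNS or /etc/hosts mapping before re-running the job.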