ClassCastException in Hadoop

Asked: 2014-02-08 21:57:07

Tags: java hadoop

When I start my MapReduce program, I get this error:

java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.BytesWritable
at nflow.hadoop.flow.analyzer.Calcul$Calcul_Mapper.map(Calcul.java:1)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)

The mapper code:

public static class Calcul_Mapper extends Mapper<LongWritable, BytesWritable, Text, Text>{

    String delimiter="|";
    long interval = 60*60 ;

    Calendar cal;

    public void map(LongWritable key, BytesWritable value, Context context) throws IOException, InterruptedException {      

        byte[] value_bytes = value.getBytes();
        if(value_bytes.length < FlowWritable.MIN_PKT_SIZE + FlowWritable.PCAP_HLEN) return; 


        EZBytes eb = new EZBytes(value_bytes.length);
        eb.PutBytes(value_bytes, 0, value_bytes.length);

        // C2S key ==> protocol | srcIP | dstIP | sPort |dPort
        long sys_uptime = Bytes.toLong(eb.GetBytes(FlowWritable.PCAP_ETHER_IP_UDP_HLEN+4,4));
        long timestamp = Bytes.toLong(eb.GetBytes(FlowWritable.PCAP_ETHER_IP_UDP_HLEN+8,4))*1000000
            + Bytes.toLong(BinaryUtils.flipBO(eb.GetBytes(FlowWritable.PCAP_ETHER_IP_UDP_HLEN+12, 4),4));


        int count = eb.GetShort(FlowWritable.PCAP_ETHER_IP_UDP_HLEN+2);

        FlowWritable fw;
        byte[] fdata = new byte[FlowWritable.FLOW_LEN];
        int cnt_flows = 0;
        int pos = FlowWritable.PCAP_ETHER_IP_UDP_HLEN+FlowWritable.CFLOW_HLEN;

        try{
            while(cnt_flows++ < count){ 
                fw = new FlowWritable();
                fdata = eb.GetBytes(pos, FlowWritable.FLOW_LEN);

                if(fw.parse(sys_uptime, timestamp, fdata)){
                    context.write(new Text("Packet"), new Text(Integer.toString(1)));
                    context.write(new Text("Byte"), new Text(Integer.toString(1)));
                    context.write(new Text("Flow"), new Text(Integer.toString(1)));
                    context.write(new Text("srcPort"), new Text(Integer.toString(fw.getSrcport())));
                    context.write(new Text("dstPort"), new Text(Integer.toString(fw.getDstport())));                        
                    context.write(new Text("srcAddr"), new Text(fw.getSrcaddr()));
                    context.write(new Text("dstAddr"), new Text(fw.getDstaddr()));
                } // else: flow record failed to parse; skip it
                pos += FlowWritable.FLOW_LEN;
            }
        } catch (NumberFormatException e) {
            // empty catch: a malformed record silently ends processing of this packet
        }
    }
}

Does anyone know what is wrong?

2 Answers:

Answer 0 (score: 0)

Could you check your job configuration? In particular, check these:

conf.setOutputKeyClass(Something.class);
conf.setOutputValueClass(Something.class);
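For reference, a minimal new-API driver sketch (the class names `Calcul` and `Calcul_Mapper` come from the question; everything else, including the use of SequenceFileInputFormat, is an assumption — with the default TextInputFormat the values arrive as Text, which is exactly what the exception reports):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;

// Hedged sketch, not the asker's actual driver.
Job job = Job.getInstance(new Configuration(), "flow-calcul");
job.setJarByClass(Calcul.class);
job.setMapperClass(Calcul.Calcul_Mapper.class);
// Assumption: the input is a SequenceFile of (LongWritable, BytesWritable);
// this is what makes the mapper's declared input types legal.
job.setInputFormatClass(SequenceFileInputFormat.class);
job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(Text.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(Text.class);
```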

By the way, since your keys are always fixed constants, you don't need to create a new one for every emit inside the map function.

I think it would be much better if you had a custom key object that groups everything together. For that you would extend ObjectWritable and implement WritableComparable.

Your writes/emits look very suspicious to me.
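As an illustration of the custom-key idea, here is a hedged sketch of a composite key. Note that implementing WritableComparable directly (rather than extending ObjectWritable) is the more common pattern; the class and field names are invented for this example:

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.WritableComparable;

// Hypothetical composite key grouping a metric name and a value.
public class FlowKey implements WritableComparable<FlowKey> {
    private String metric;  // e.g. "srcPort"
    private int value;

    public FlowKey() {}  // Hadoop requires a no-arg constructor for deserialization

    public FlowKey(String metric, int value) {
        this.metric = metric;
        this.value = value;
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeUTF(metric);
        out.writeInt(value);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        metric = in.readUTF();
        value = in.readInt();
    }

    @Override
    public int compareTo(FlowKey other) {
        int c = metric.compareTo(other.metric);
        return c != 0 ? c : Integer.compare(value, other.value);
    }
}
```

With a key like this, the mapper emits one typed record instead of seven parallel Text writes, and the shuffle groups records naturally by (metric, value).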

Answer 1 (score: 0)

Does your job read its input from a plain text file? If so, the input value type should be Text, not BytesWritable:

public static class Calcul_Mapper extends Mapper<LongWritable, Text, Text, Text>
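Put differently, the exception means the InputFormat is handing the mapper Text values while the mapper's signature promises BytesWritable. Two internally consistent fixes, sketched under the assumption that the raw pcap data was either stored as text lines or as a SequenceFile:

```java
// Option 1: the input really is a plain text file (default TextInputFormat),
// so accept Text and decode the bytes yourself:
public void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
    byte[] value_bytes = value.getBytes();  // note: length is value.getLength()
    // ... rest of the existing mapper body ...
}

// Option 2: keep the BytesWritable signature, but configure the job to read a
// SequenceFile whose values actually are BytesWritable:
job.setInputFormatClass(SequenceFileInputFormat.class);
```

Either change resolves the ClassCastException; which one is right depends on how the input files were produced.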