Unit testing an ORC mapper with MRUnit

Posted: 2015-11-06 03:35:03

Tags: hadoop mapreduce hadoop2 mrunit orc

I have a mapper that processes ORC files. In the driver I set the input format to OrcNewInputFormat:

job.setInputFormatClass(OrcNewInputFormat.class); 

With OrcNewInputFormat, the value type is OrcStruct. The Writable passed into the map method as the value parameter is cast to OrcStruct inside the map, like this:

OrcStruct record = (OrcStruct) value;

I want to test this mapper with MRUnit. To that end, in the unit test's setup method I create an ORC file at testFilePath:

Writer writer = OrcFile.createWriter(testFilePath,
        OrcFile.writerOptions(conf)
               .inspector(inspector)
               .stripeSize(100000)
               .bufferSize(10000)
               .version(OrcFile.Version.V_0_12));
writer.addRow(new SimpleStruct("k1", "v1"));

public static class SimpleStruct {
    Text k;
    Text string1;

    SimpleStruct(String b1, String s1) {
        this.k = new Text(b1);
        if (s1 == null) {
            this.string1 = null;
        } else {
            this.string1 = new Text(s1);
        }
    }
}
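(The `inspector` passed to `writerOptions` above isn't shown in the post; one common way to obtain it for a POJO like `SimpleStruct` is Hive's reflection-based factory. A sketch, assuming the standard Hive serde2 API is on the classpath:)

```java
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;

// Build a struct ObjectInspector from SimpleStruct's fields via reflection,
// so the ORC writer knows the schema (k: string, string1: string).
ObjectInspector inspector = ObjectInspectorFactory.getReflectionObjectInspector(
        SimpleStruct.class, ObjectInspectorFactory.ObjectInspectorOptions.JAVA);
```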

Then in the test method I read the file back and invoke the mapper through MRUnit. Here is the code:

// Read orc file
Reader reader = OrcFile.createReader(fs, testFilePath);
RecordReader recordRdr = reader.rows();
OrcStruct row = null;
List<OrcStruct> mapData = new ArrayList<>();

while (recordRdr.hasNext()) {
    row = (OrcStruct) recordRdr.next(row);
    mapData.add(row);
}

// test mapper
initializeSerde(mapDriver.getConfiguration());

Writable writable = getWritable(mapData.get(0)); // test 1st record's mapper processing
mapDriver.withCacheFile(strCachePath).withInput(NullWritable.get(), writable);
mapDriver.runTest();

But when I run the test case I get this error:

java.lang.UnsupportedOperationException: can't write the bundle
at org.apache.hadoop.hive.ql.io.orc.OrcSerde$OrcSerdeRow.write(OrcSerde.java:61)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:98)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:82)
at org.apache.hadoop.mrunit.internal.io.Serialization.copy(Serialization.java:80)
at org.apache.hadoop.mrunit.internal.io.Serialization.copy(Serialization.java:97)
at org.apache.hadoop.mrunit.internal.io.Serialization.copyWithConf(Serialization.java:110)
at org.apache.hadoop.mrunit.TestDriver.copy(TestDriver.java:675)
at org.apache.hadoop.mrunit.TestDriver.copyPair(TestDriver.java:679)
at org.apache.hadoop.mrunit.MapDriverBase.addInput(MapDriverBase.java:120)
at org.apache.hadoop.mrunit.MapDriverBase.withInput(MapDriverBase.java:210)

Looking at OrcSerde, I can see that OrcSerdeRow.write throws UnsupportedOperationException, and MRUnit relies on exactly that write path to copy the input pair through the configured serialization. So the test case fails before the mapper is even invoked.

How can we unit test a mapper that processes ORC files? Is there another way to do this, or something that needs to change in my approach?
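(One workaround, not from the original post, is to keep MRUnit's copy step out of the ORC serialization path entirely: factor the per-record logic out of `map()` into a plain static method that operates on ordinary Java values, and unit test that method directly. The thin `map()` wrapper then only does the `OrcStruct` cast and field extraction. A self-contained sketch, with plain `String`s standing in for the extracted ORC fields and a made-up transformation for illustration:)

```java
import java.util.AbstractMap;
import java.util.Map;

public class OrcMapperLogic {
    // Hypothetical per-record logic, factored out of map() so it can be
    // tested without MRUnit or ORC serialization. The real map() would cast
    // value to OrcStruct, pull the k and string1 fields out as Strings, and
    // delegate here.
    public static Map.Entry<String, String> process(String k, String v) {
        // Example transformation only: normalize the key.
        return new AbstractMap.SimpleEntry<>(k.trim().toUpperCase(), v);
    }

    public static void main(String[] args) {
        Map.Entry<String, String> out = process(" k1 ", "v1");
        System.out.println(out.getKey() + "=" + out.getValue());
    }
}
```

With this split, the ORC-specific plumbing (the cast, the reader loop in the setup) can be exercised separately, and the business logic gets fast JUnit tests that never touch OrcSerde.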

Thanks in advance for your help.

0 Answers:

There are no answers yet.