PigLatin: java.lang.OutOfMemoryError: GC overhead limit exceeded

Date: 2012-11-14 23:02:04

Tags: mapreduce weka apache-pig

I am very new to Pig Latin and am trying to understand it through my homework; I have used MapReduce before. I am getting a GC overhead error. PS: my input is just a simple 10-line CSV file.

I am trying to convert the CSV format to ARFF.

My UDF:

import java.io.ByteArrayInputStream;
import java.io.IOException;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

import weka.core.Instances;
import weka.core.converters.ArffSaver;
import weka.core.converters.CSVLoader;

public class CSV2ARFF extends EvalFunc<String> {
    private String arffDataString;
    private String arffHeaderString;

    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0)
            return null;
        try {
            System.out.println(">>> " + input.get(0).toString());
            // parse the CSV text into Weka Instances
            ByteArrayInputStream inputStream =
                new ByteArrayInputStream(input.get(0).toString().getBytes("UTF-8"));
            CSVLoader loader = new CSVLoader();
            loader.setSource(inputStream);
            Instances data = loader.getDataSet(); // **Line #30**
            // convert into ARFF
            ArffSaver arff = new ArffSaver();
            arff.setInstances(data);
            Instances arffdata = arff.getInstances();
            this.arffDataString = arffdata.toString();
            // build an empty copy to get the header text, then strip it off
            Instances header = new Instances(arffdata, 0);
            this.arffHeaderString = header.toString();
            this.arffDataString = this.arffDataString.substring(this.arffHeaderString.length());

            return arffDataString;

        } catch (Exception e) {
            System.err.println("CSV2ARFF: failed to process input; error - " + e.getMessage());
            return null;
        }
    }
}
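The last few lines of the UDF rely on Weka's ARFF text being the header section followed by the data rows, so the header can be cut off with `substring`. That step can be seen in isolation with plain strings (a standalone sketch using made-up ARFF content and a hypothetical class name, no Weka needed):

```java
public class HeaderStripDemo {
    public static void main(String[] args) {
        // a made-up two-column ARFF document: header section, then data rows
        String header = "@relation sample\n@attribute a numeric\n@attribute b numeric\n\n@data\n";
        String full = header + "1,2\n3,4\n";
        // keep only the rows after the header, exactly as the UDF does
        String dataOnly = full.substring(header.length());
        System.out.println(dataOnly);  // prints "1,2" and "3,4" on separate lines
    }
}
```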

My script.pig:

REGISTER ./csv2arff.jar;
REGISTER ./weka.jar;

csvraw = LOAD 'sample' USING PigStorage('\n') as (c);

arffraws = FOREACH csvraw GENERATE pighw2java.CSV2ARFF(c);

--output

STORE arffraws INTO 'output' using PigStorage();

The error:

java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.nio.CharBuffer.wrap(CharBuffer.java:369)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:310)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:177)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:154)
at java.io.BufferedReader.read(BufferedReader.java:175)
at java.io.StreamTokenizer.read(StreamTokenizer.java:500)
at java.io.StreamTokenizer.nextToken(StreamTokenizer.java:544)
at weka.core.converters.ConverterUtils.getToken(ConverterUtils.java:888)
at weka.core.converters.CSVLoader.readHeader(CSVLoader.java:937)
at weka.core.converters.CSVLoader.readStructure(CSVLoader.java:578)
at weka.core.converters.CSVLoader.getStructure(CSVLoader.java:563)
at weka.core.converters.CSVLoader.getDataSet(CSVLoader.java:596)
at pighw2java.CSV2ARFF.exec(CSV2ARFF.java:30)
at pighw2java.CSV2ARFF.exec(CSV2ARFF.java:1)

1 Answer:

Answer 0: (score: 0)

I ran into a similar situation. Running Pig in local mode caused this error (pig -x local). When I ran the same query in MapReduce mode (pig), the problem went away.
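The two invocations differ only in the execution-mode flag. A minimal sketch (`script.pig` stands for the questioner's script above; `PIG_HEAPSIZE` is an environment variable honored by the stock Apache Pig launcher, offered here as an assumption worth verifying for your version):

```shell
pig -x local script.pig   # local mode: one client-side JVM with its default heap (GC error here)
pig script.pig            # mapreduce mode: work runs in the cluster's task JVMs (worked)

# if local mode is required, try giving the client JVM a larger heap (value in MB)
export PIG_HEAPSIZE=1024
pig -x local script.pig
```

In local mode the whole query, including the Weka conversion, runs inside the Pig client's JVM, so a small default heap can hit "GC overhead limit exceeded" even on tiny inputs.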

Hope it helps.