How to convert a text file to Parquet with Java Spark

Date: 2018-08-28 17:36:22

Tags: java apache-spark parquet

I'm trying to convert a text file to a Parquet file. I could only find "how to convert to Parquet" guides for other file formats, or code written in Scala/Python. Here is what I came up with:

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.*;

import java.io.IOException;

private static final StructField[] fields = new StructField[]{
        new StructField("timeCreate", DataTypes.StringType, false, Metadata.empty()),
        new StructField("cookieCreate", DataTypes.StringType, false, Metadata.empty())
}; // simplified
private static final StructType schema = new StructType(fields);

public static void main(String[] args) throws IOException {
    SparkSession spark = SparkSession
            .builder().master("spark://levanhuong:7077")
            .appName("Convert text file to Parquet")
            .getOrCreate();
    spark.conf().set("spark.executor.memory", "1G");
    WriteParquet(spark, args);

}
public static void WriteParquet(SparkSession spark, String[] args){
    JavaRDD<String> data = spark.read().textFile(args[0]).toJavaRDD();
    JavaRDD<Row> output = data.map((Function<String, Row>) s -> {
        DataModel model = new DataModel(s);
        return RowFactory.create(model);
    });

    Dataset<Row> df = spark.createDataFrame(output.rdd(),schema);
    df.printSchema();
    df.show(2);
    df.write().parquet(args[1]);
}

args[0] is the path to the input file and args[1] is the path to the output file. Here is the simplified DataModel; the DateTime fields are formatted properly in the set() functions:

public class DataModel implements Serializable {
    DateTime timeCreate;
    DateTime cookieCreate;

    public DataModel(String data) {
        String[] model = data.split("\t");
        setTimeCreate(model[0]);
        setCookieCreate(model[1]);
    }

    // setTimeCreate()/setCookieCreate() parse the strings into DateTime fields; omitted here
}

Here is the error. The error log points to df.show(2), but I believe the error is caused by map(). I'm not sure why, since I don't see any casting in the code:

    java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda
    to field org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.fun$1
    of type org.apache.spark.api.java.function.Function
    in instance of org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1

I think this is enough to reproduce the error. Let me know if I need to provide more information.
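One note (an assumption on my part, not something confirmed in this thread): this SerializedLambda ClassCastException is frequently seen when a program runs against a standalone master (spark://...) straight from an IDE, so the executors never receive the application jar and cannot deserialize the lambda classes. A minimal sketch of that workaround, with a hypothetical jar path:

SparkSession spark = SparkSession
        .builder().master("spark://levanhuong:7077")
        .appName("Convert text file to Parquet")
        // Ship the packaged application jar to the executors so they can
        // deserialize the driver's lambda classes (the path is illustrative).
        .config("spark.jars", "target/text-to-parquet.jar")
        .getOrCreate();

Packaging the program and launching it with spark-submit has the same effect.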

1 Answer:

Answer 0 (score: 0)

You can use a different approach, which works fine:

    JavaRDD<String> data = spark.read().textFile(args[0]).toJavaRDD();
    JavaRDD<DataModel> output = data.map(s -> {
        String[] parts = s.split("\t");
        return new DataModel(parts[0], parts[1]);
    });
    Dataset<Row> result = spark.createDataFrame(output, DataModel.class);
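Here createDataFrame(JavaRDD, Class) derives the schema from the bean's getters by reflection, so the hand-written StructType and the manual Row construction are no longer needed.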

And the "DataModel" class is best kept as a simple transfer object, with no logic in it:

public class DataModel implements Serializable {
    private final String timeCreate;
    private final String cookieCreate;

    public DataModel(String timeCreate, String cookieCreate) {
        this.timeCreate = timeCreate;
        this.cookieCreate = cookieCreate;
    }

    public String getTimeCreate() {
        return timeCreate;
    }

    public String getCookieCreate() {
        return cookieCreate;
    }
}
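For completeness, a minimal end-to-end sketch that wires this approach into the original program (the class name and structure are illustrative, not taken from the thread; it assumes the DataModel bean above is on the classpath):

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class TextToParquet {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("Convert text file to Parquet")
                .getOrCreate();

        // Map each tab-separated line of the input to a DataModel bean.
        JavaRDD<DataModel> output = spark.read().textFile(args[0])
                .toJavaRDD()
                .map(s -> {
                    String[] parts = s.split("\t");
                    return new DataModel(parts[0], parts[1]);
                });

        // Schema is inferred from DataModel's getters; write the result as Parquet.
        Dataset<Row> result = spark.createDataFrame(output, DataModel.class);
        result.write().parquet(args[1]);

        spark.stop();
    }
}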