Spark cannot convert LocalDate when loading data from Cassandra

Posted: 2017-09-11 06:57:16

Tags: java apache-spark spark-cassandra-connector

When loading data from Cassandra, Spark throws a java.lang.IllegalArgumentException because a column in the table is of LocalDate type.

Here is my repository code:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapRowTo;

private transient SparkConf sparkConf;
private JavaSparkContext sc;

public List<Person> findAll() {
    sparkConf = new SparkConf();
    sparkConf.setAppName("investhry");
    sparkConf.setMaster("local[4]");
    sparkConf.set("spark.cassandra.connection.host", "localhost");
    sc = new JavaSparkContext(sparkConf);
    // Map each row of java_api.person to a Person bean
    JavaRDD<Person> personJavaRDD = javaFunctions(sc).cassandraTable("java_api", "person", mapRowTo(Person.class));
    List<Person> people = personJavaRDD.collect();
    sc.stop();
    return people;
}
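
The Person class is not shown in the question; a minimal sketch of what it presumably contains is below (field names are assumptions, getters and setters omitted for brevity), with the java.time.LocalDate field that the connector's TypeConverter cannot handle:

// Assumed sketch of the bean passed to mapRowTo(Person.class);
// the java.time.LocalDate field is what triggers the error below.
public class Person implements java.io.Serializable {
    private String name;               // illustrative text column
    private java.time.LocalDate dob;   // Cassandra 'date' column mapped to java.time.LocalDate
}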

Here is the error:

java.lang.IllegalArgumentException: Unsupported type: java.time.LocalDate
at com.datastax.spark.connector.types.TypeConverter$.forCollectionType(TypeConverter.scala:930)
at com.datastax.spark.connector.types.TypeConverter$.forType(TypeConverter.scala:943)
at com.datastax.spark.connector.types.TypeConverter$.forType(TypeConverter.scala:962)
at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter.converter(GettableDataToMappedTypeConverter.scala:108)
at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter.com$datastax$spark$connector$rdd$reader$GettableDataToMappedTypeConverter$$converter(GettableDataToMappedTypeConverter.scala:117)
at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter$$anonfun$7.apply(GettableDataToMappedTypeConverter.scala:184)
at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter$$anonfun$7.apply(GettableDataToMappedTypeConverter.scala:181)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.MapLike$DefaultKeySet.foreach(MapLike.scala:174)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractSet.scala$collection$SetLike$$super$map(Set.scala:47)
at scala.collection.SetLike$class.map(SetLike.scala:93)
at scala.collection.AbstractSet.map(Set.scala:47)
at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter.<init>(GettableDataToMappedTypeConverter.scala:181)
at com.datastax.spark.connector.rdd.reader.ClassBasedRowReader.<init>(ClassBasedRowReader.scala:21)
at com.datastax.spark.connector.rdd.reader.ClassBasedRowReaderFactory.rowReader(ClassBasedRowReader.scala:44)
at com.datastax.spark.connector.rdd.reader.ClassBasedRowReaderFactory.rowReader(ClassBasedRowReader.scala:39)
at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.rowReader(CassandraTableRowReaderProvider.scala:48)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.rowReader$lzycompute(CassandraTableScanRDD.scala:62)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.rowReader(CassandraTableScanRDD.scala:62)
at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.verify(CassandraTableRowReaderProvider.scala:138)

1 Answer:

Answer 0 (score: 0)

Try using java.util.Date instead of LocalDate in your Person class; that should resolve the problem.
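
A minimal sketch of that change, assuming the Person bean has a date-of-birth style field (the field name is illustrative, since the original class is not shown); java.util.Date is a type the connector's TypeConverter supports:

// Assumed Person bean with the date field switched to java.util.Date.
public class Person implements java.io.Serializable {
    private String name;            // illustrative text column
    private java.util.Date dob;     // was java.time.LocalDate

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public java.util.Date getDob() { return dob; }
    public void setDob(java.util.Date dob) { this.dob = dob; }
}

If the rest of the application still needs a java.time.LocalDate, the value can be converted after loading, for example:

// Convert the loaded java.util.Date back to java.time.LocalDate in application code.
java.time.LocalDate dob = person.getDob().toInstant()
        .atZone(java.time.ZoneId.systemDefault())
        .toLocalDate();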