How can I use java.time.LocalDate in Cassandra queries from Spark?

Asked: 2016-10-13 12:54:22

Tags: apache-spark cassandra apache-spark-sql spark-cassandra-connector

We have a table in Cassandra with a `start_time` column of type `date`.

When we execute the following code:

val resultRDD = inputRDD.joinWithCassandraTable(KEY_SPACE, TABLE)
  .where("start_time = ?", java.time.LocalDate.now)

we get the following error:

com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-10-13 of type class java.time.LocalDate to com.datastax.driver.core.LocalDate.
at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45)
at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43)
at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$14.applyOrElse(TypeConverter.scala:449)
at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:439)
at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:439)
at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$29.applyOrElse(TypeConverter.scala:788)
at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:771)
at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:771)
at com.datastax.spark.connector.writer.BoundStatementBuilder$$anonfun$8.apply(BoundStatementBuilder.scala:93)

I tried to register custom converters as described in the documentation:

import scala.reflect.runtime.universe._
import com.datastax.spark.connector.types.TypeConverter

object JavaLocalDateToCassandraLocalDateConverter extends TypeConverter[com.datastax.driver.core.LocalDate] {
  def targetTypeTag = typeTag[com.datastax.driver.core.LocalDate]
  def convertPF = {
    case ld: java.time.LocalDate => com.datastax.driver.core.LocalDate.fromYearMonthDay(ld.getYear, ld.getMonthValue, ld.getDayOfMonth)
    case _ => com.datastax.driver.core.LocalDate.fromYearMonthDay(1971, 1, 1)
  }
}

object CassandraLocalDateToJavaLocalDateConverter extends TypeConverter[java.time.LocalDate] {
  def targetTypeTag = typeTag[java.time.LocalDate]
  def convertPF = {
    case ld: com.datastax.driver.core.LocalDate => java.time.LocalDate.of(ld.getYear, ld.getMonth, ld.getDay)
    case _ => java.time.LocalDate.now
  }
}

TypeConverter.registerConverter(JavaLocalDateToCassandraLocalDateConverter)
TypeConverter.registerConverter(CassandraLocalDateToJavaLocalDateConverter)

But it did not help.

How can I use the JDK 8 date/time classes in Cassandra queries executed from Spark?

2 Answers:

Answer 0 (score: 2)

I think the easiest approach in a where clause like this is just to call

sc
 .cassandraTable("test", "test")
 .where("start_time = ?", java.time.LocalDate.now.toString)
 .collect

and pass in the string, since that is a well-defined conversion.
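The reason the string workaround is well defined is that `java.time.LocalDate.toString` always emits ISO-8601 (`yyyy-MM-dd`), which is the literal form Cassandra accepts for its `date` type. A minimal pure-JDK sketch, with no Spark or Cassandra required (the sample date is illustrative):

```scala
import java.time.LocalDate
import java.time.format.DateTimeFormatter

object DateStringDemo extends App {
  // java.time.LocalDate.toString always emits ISO-8601 (yyyy-MM-dd).
  val d = LocalDate.of(2016, 10, 13)
  println(d.toString)                                 // 2016-10-13

  // Equivalent explicit formatting via DateTimeFormatter:
  println(d.format(DateTimeFormatter.ISO_LOCAL_DATE)) // 2016-10-13
}
```

Because the format is fixed, the string produced on the driver means the same thing on every executor, sidestepping the converter machinery entirely.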

There seems to be an issue in TypeConverters where user-registered converters do not take precedence over the built-in ones. I'll take a quick look.

-- Edit --

It appears that registered converters are not being correctly transferred to the executors. In local mode the code works as expected, which makes me think this is a serialization issue. I will open a ticket against the Spark Cassandra Connector for this problem.

Answer 1 (score: 0)

The Cassandra date format is yyyy-MM-dd HH:mm:ss.SSS

So, if you are on Java 8, you can use the code below to convert the Cassandra date into a LocalDate and then apply your logic.

import java.time.LocalDateTime
import java.time.format.DateTimeFormatter

val formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS")
val dateTime = LocalDateTime.parse(cassandraDateTime, formatter)

Alternatively, you can convert the LocalDate into the Cassandra date format and compare that.
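A self-contained round-trip sketch of the conversion described in this answer, using the pattern quoted above (the sample timestamp string and object name are illustrative):

```scala
import java.time.{LocalDate, LocalDateTime}
import java.time.format.DateTimeFormatter

object CassandraDateRoundTrip extends App {
  val formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS")

  // Parse a timestamp string as it might come back from Cassandra...
  val cassandraDateTime = "2016-10-13 12:54:22.000"
  val dateTime = LocalDateTime.parse(cassandraDateTime, formatter)

  // ...keep just the date part for date-level logic...
  val localDate: LocalDate = dateTime.toLocalDate
  println(localDate)                    // 2016-10-13

  // ...and format it back into the same pattern for comparison.
  println(dateTime.format(formatter))   // 2016-10-13 12:54:22.000
}
```

Note that parsing and formatting with the same `DateTimeFormatter` is lossless, so comparisons can be done either on `LocalDate` values or on the formatted strings.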