NoSuchElementException: columns not found in the table when saving a dataset to Cassandra

Time: 2019-09-04 15:01:58

Tags: apache-spark cassandra datastax datastax-java-driver

I am using spark-cassandra-connector_2.11-2.4.1 and spark-sql 2.4.1 in a PoC.

When I populate the VO data and save it with df.write.options(...).save(), I get a NoSuchElementException.

Details below.

I have defined this Cassandra table:

CREATE TABLE company(
    company_id int,
    company_name text,
    trans_date date,
    last_update_date date,
    PRIMARY KEY (company_id, trans_date)
) WITH CLUSTERING ORDER BY ( trans_date DESC );


import java.io.Serializable;

import com.datastax.driver.core.LocalDate; // assumption: the driver's date type (the original snippet omits this import)
import com.datastax.driver.mapping.annotations.ClusteringColumn;
import com.datastax.driver.mapping.annotations.Column;
import com.datastax.driver.mapping.annotations.PartitionKey;
import com.datastax.driver.mapping.annotations.Table;

@Table(name = "company")
public class CompanyRecord implements Serializable {
    private static final long serialVersionUID = 1L;

    @PartitionKey(0)
    @Column(name = "company_id")
    private Integer companyId;

    @Column(name = "company_name")
    private String companyName;

    @ClusteringColumn(0)
    @Column(name = "trans_date")
    private LocalDate transDate;

    @Column(name = "last_update_date")
    private LocalDate lastUpdateDate;

    // constructors, getters and setters here
}
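
For reference, the save call presumably looks something like this (a minimal sketch; the SparkSession setup, the keyspace name "ks", and the loadRecords() helper are my assumptions, not from the post):

import java.util.List;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

SparkSession spark = SparkSession.builder().getOrCreate();

// Build a DataFrame from the annotated bean; the DataFrame's column names
// come from the bean property names (companyId, companyName, ...).
List<CompanyRecord> records = loadRecords(); // hypothetical helper
Dataset<Row> df = spark.createDataFrame(records, CompanyRecord.class);

// Append the rows to the Cassandra table via the connector's DataFrame source.
df.write()
  .format("org.apache.spark.sql.cassandra")
  .option("keyspace", "ks") // assumption: the real keyspace is not shown
  .option("table", "company")
  .mode(SaveMode.Append)
  .save();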

When I try to save the dataset, the following exception is thrown:

java.util.NoSuchElementException: Columns not found in table company: companyId, companyName, lastUpdateDate, transDate
    at com.datastax.spark.connector.SomeColumns.selectFrom(ColumnSelector.scala:44)

What am I doing wrong here? Why aren't the annotated names mapped to the table's column names, and how can I fix this?
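As far as I can tell, the connector's DataFrame writer does not consult the Java driver's mapping annotations at all; it matches the DataFrame's own column names (here the bean property names, which is exactly the list in the exception) against the table's columns. A sketch of aligning the names before saving, reusing the df from the write sketch above:

Dataset<Row> renamed = df
    .withColumnRenamed("companyId", "company_id")
    .withColumnRenamed("companyName", "company_name")
    .withColumnRenamed("transDate", "trans_date")
    .withColumnRenamed("lastUpdateDate", "last_update_date");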

Second problem: now I get this error:

com.datastax.spark.connector.types.TypeConversionException: Cannot convert object [] of type class org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema to com.datastax.driver.core.LocalDate.
    at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43)
    at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$23.applyOrElse(TypeConverter.scala:426)
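
My reading of this second error, offered as an assumption rather than a confirmed diagnosis: Spark 2.4's bean encoder only recognizes java.sql.Date and java.sql.Timestamp as date-like types, so a com.datastax.driver.core.LocalDate field gets encoded as a generic struct, which is the GenericRowWithSchema the converter then fails on. A sketch of the date fields declared with java.sql.Date instead:

import java.sql.Date;

// Sketch: java.sql.Date maps to Spark's DateType, which the connector
// converts to a CQL date on write.
@ClusteringColumn(0)
@Column(name = "trans_date")
private Date transDate;

@Column(name = "last_update_date")
private Date lastUpdateDate;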

0 Answers