Spark Cassandra connector problem: value write is not a member of Unit on BOresultDf.write

Asked: 2018-06-11 11:41:33

Tags: scala apache-spark cassandra spark-cassandra-connector

I am trying to write a DataFrame to Cassandra with the following code:

val BackOfficeDF = spark.read
      .format("com.databricks.spark.avro")
      .load("/home/user/avro/data/output/backoffice/test.avro")

val uuid = udf(() => java.util.UUID.randomUUID().toString)

val BOresultDf = FilteredPSPDF.join(FilteredBackOfficeDF, FilteredBackOfficeDF("ProviderReference") === FilteredPSPDF("orderid"),"full_outer")
      .select( "TransactionId","BOTransactionType","PSPId", "UserId","Status","BOCurrency","BOAmount","Date","ProviderReference")
      .filter(FilteredPSPDF("orderid").isNull)
      .withColumnRenamed("BOTransactionType", "transaction_type")
      .withColumnRenamed("BOCurrency", "currency")
      .withColumnRenamed("BOAmount", "amount")
      .withColumnRenamed("PSPId", "psp_id")
      .withColumnRenamed("ProviderReference", "provider_reference")
      .withColumnRenamed("TransactionId", "transaction_id")
      .withColumnRenamed("UserId", "user_id")
      .withColumnRenamed("Status", "status")
      .withColumnRenamed("Date", "date")
      .withColumn("backoffice_payment_id", uuid())
      .show()

    BOresultDf.write
          .format("org.apache.spark.sql.cassandra")
          .mode("append")
          .option("keyspace","payments")
          .option("table","backoffice_payments")
          .save()

I am using the following dependencies in SBT:

resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"
libraryDependencies += "datastax" % "spark-cassandra-connector" % "2.0.1-s_2.11"

The Scala version is 2.11.8.

But I get this error:

Error:(80, 16) value write is not a member of Unit
    BOresultDf.write

What am I doing wrong?

0 Answers:

No answers yet.
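A likely cause, sketched here as a hypothesis rather than a confirmed answer from the thread: `DataFrame.show()` returns `Unit`, so ending the chain with `.show()` binds `BOresultDf` to `Unit` instead of a `DataFrame`, which is exactly why `write` is "not a member of Unit". Keeping the question's own identifiers, the fix would look like this:

```scala
// show() returns Unit, so it must not be the last call in the val binding.
// Build the DataFrame first, then inspect and write it separately.
val BOresultDf = FilteredPSPDF
  .join(FilteredBackOfficeDF,
        FilteredBackOfficeDF("ProviderReference") === FilteredPSPDF("orderid"),
        "full_outer")
  .select("TransactionId", "BOTransactionType", "PSPId", "UserId",
          "Status", "BOCurrency", "BOAmount", "Date", "ProviderReference")
  .filter(FilteredPSPDF("orderid").isNull)
  .withColumnRenamed("BOTransactionType", "transaction_type")
  // ...the remaining withColumnRenamed calls from the question go here...
  .withColumn("backoffice_payment_id", uuid())

BOresultDf.show()   // optional inspection; its Unit result is discarded

BOresultDf.write
  .format("org.apache.spark.sql.cassandra")
  .mode("append")
  .option("keyspace", "payments")
  .option("table", "backoffice_payments")
  .save()
```

With `.show()` moved out of the chain, `BOresultDf` is a `DataFrame`, and `.write` resolves normally.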