assertion failed: No plan for MetastoreRelation ext_sub_cust_profile

Time: 2019-02-13 12:44:20

Tags: apache-spark apache-spark-sql spark-streaming

I am hitting the following problem when joining two DataFrames in Spark Streaming. Could you help me resolve it?

assertion failed: No plan for MetastoreRelation test_db ext_sub_cust_profile.

  val custPomerDF: DataFrame = hiveContext.sql("select rowkey, mkt_opto_flag, thrd_party_opto_flag from test_db.sub_cust_profile")
  val custProfileDF: DataFrame = broadcast(custPomerDF.as("custProfile"))

  messages.foreachRDD { rdd =>
    val decodedMessages: RDD[SubscriberDetails] = decodeMessage(rdd).filter(_.key != null)

    val sqlContext = new SQLContext(sparkContext)
    import sqlContext.implicits._

    val decodeMessagesDF: DataFrame = decodedMessages.toDF()
    //val enrichedDF: DataFrame = decodeMessagesDF.join(broadcast(custProfileDF), (decodeMessagesDF("key") === custProfileDF("rowkey")) && (custProfileDF("mkt_opto_flag") === "N") && (custProfileDF("thrd_party_opto_flag") === "N"))

    val enrichedDF: DataFrame = decodeMessagesDF.join(broadcast(custProfileDF), $"key" === $"rowkey", "inner")

    decodeMessagesDF.write
      .format("kafka")
      .option("kafka.bootstrap.servers", kafkaBrokers)
      .option("topic", outputTopic)
      .save()
  }

sub_cust_profile is a view created from ext_sub_cust_profile. Thank you.
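One detail worth noting: `custProfileDF` is planned by `hiveContext`, but each micro-batch builds a fresh `SQLContext` that has no Hive catalog, which is a plausible source of the "No plan for MetastoreRelation" assertion. The sketch below, which reuses the Hive-aware context and materializes the Hive-backed side before streaming starts, is only an illustrative assumption, not a confirmed fix; names such as `messages`, `decodeMessage`, `kafkaBrokers`, and `outputTopic` are carried over from the code above.

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.broadcast

// Sketch (assumption): materialize the profile table once, on the driver,
// while the Hive catalog is available, instead of re-planning it per batch.
val custProfileDF: DataFrame = hiveContext
  .sql("select rowkey, mkt_opto_flag, thrd_party_opto_flag from test_db.sub_cust_profile")
  .cache()
custProfileDF.count() // force evaluation so the MetastoreRelation is resolved up front

messages.foreachRDD { rdd =>
  // Reuse the Hive-aware context rather than creating a new SQLContext,
  // so both sides of the join are planned by the same catalog.
  import hiveContext.implicits._

  val decodeMessagesDF: DataFrame = decodeMessage(rdd).filter(_.key != null).toDF()

  val enrichedDF: DataFrame = decodeMessagesDF
    .join(broadcast(custProfileDF), $"key" === $"rowkey", "inner")

  // Write the joined result (the original snippet writes decodeMessagesDF,
  // which leaves enrichedDF unused).
  enrichedDF.write
    .format("kafka")
    .option("kafka.bootstrap.servers", kafkaBrokers)
    .option("topic", outputTopic)
    .save()
}
```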

0 Answers