Unable to ... in a SPARK dataset

Time: 2018-05-17 08:00:59

Tags: apache-spark

Below is the sample code:

spark-shell --master yarn 
val hive_location = "hive_meta_loc"
val spark = org.apache.spark.sql.SparkSession.builder()
  .appName("WNGDEVICES")
  .config("spark.sql.warehouse.dir", hive_location)  // pass the variable, not the string literal "hive_location"
  .enableHiveSupport()
  .getOrCreate()
val rawdata = spark.sql("SELECT * FROM <table name>")
rawdata.createOrReplaceTempView("Rawdata")

When I execute this, I get the following error:

[error screenshot]
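For reference, here is a minimal sketch of the same flow written as a standalone application rather than typed into spark-shell; it assumes that spark.sql.warehouse.dir has to be supplied before the first SparkSession is created (inside spark-shell a session named spark already exists, so the builder's config call may have no effect). The hive_meta_loc path and <table name> are placeholders carried over from the code above.

import org.apache.spark.sql.SparkSession

object WngDevices {
  def main(args: Array[String]): Unit = {
    // placeholder warehouse path taken from the question
    val hive_location = "hive_meta_loc"

    // build a Hive-enabled session; the warehouse dir is set here,
    // before the session exists, so that it can take effect
    val spark = SparkSession.builder()
      .appName("WNGDEVICES")
      .config("spark.sql.warehouse.dir", hive_location)
      .enableHiveSupport()
      .getOrCreate()

    // <table name> is a placeholder for the Hive table being queried
    val rawdata = spark.sql("SELECT * FROM <table name>")
    rawdata.createOrReplaceTempView("Rawdata")

    // example use of the registered temp view
    spark.sql("SELECT COUNT(*) FROM Rawdata").show()

    spark.stop()
  }
}

Such an application would be packaged and run with spark-submit --master yarn instead of being pasted line by line into the shell.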

0 answers:

No answers