Spark 2.0 - "Table or view not found" when querying Hive

Asked: 2016-08-13 09:31:12

Tags: hadoop apache-spark hive apache-spark-sql

Via spark-shell 2.0, when querying Hive:

spark.sql("SELECT * FROM schemaname.tablename")

it throws the error:

16/08/13 09:24:17 INFO execution.SparkSqlParser: Parsing command: SELECT * FROM schemaname.tablename
org.apache.spark.sql.AnalysisException: Table or view not found: `schemaname`.`tablename`; line 1 pos 14
...

Hive access appears to be correctly configured through hive-site.xml. In the shell, Spark prints:

scala> spark.conf.get("spark.sql.warehouse.dir")
res5: String = /user/hive/warehouse

Hive is configured in conf/hive-site.xml, and its configuration is visible to Spark. Listing the databases shows the existing default database, but no tables are listed inside default:

scala> spark.catalog.listDatabases.show(false)
+-------+----------------+-----------------------------------------------+
|name   |description     |locationUri                                    |
+-------+----------------+-----------------------------------------------+
|default|default database|hdfs://hdfs-server-uri:8020/user/hive/warehouse|
+-------+----------------+-----------------------------------------------+

scala> spark.catalog.listTables("default").show()
+----+--------+-----------+---------+-----------+
|name|database|description|tableType|isTemporary|
+----+--------+-----------+---------+-----------+
+----+--------+-----------+---------+-----------+
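For context, in Spark 2.0 the session sees Hive metastore tables only when it is built with Hive support and hive-site.xml is on Spark's classpath (spark-shell normally provides a pre-built Hive-enabled `spark` when Spark was compiled with Hive). A minimal sketch of how such a session is constructed in a standalone application, assuming Spark was built with the Hive profile and hive-site.xml sits in $SPARK_HOME/conf (not runnable without a live metastore):

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: enableHiveSupport() makes the session use the Hive
// metastore catalog instead of the default in-memory catalog. Without
// it, spark.catalog.listTables("default") returns an empty result even
// though the tables exist in Hive.
val spark = SparkSession.builder()
  .appName("hive-check")
  .enableHiveSupport() // requires Spark built with -Phive
  .getOrCreate()

// With Hive support active, metastore tables become visible:
spark.catalog.listTables("default").show()
spark.sql("SELECT * FROM schemaname.tablename").show()
```

An empty table listing combined with a reachable warehouse directory is the typical symptom of the session falling back to the in-memory catalog rather than a connectivity problem.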

What might be missing for accessing the Hive tables?

0 Answers:

There are no answers yet.