ApplicationMaster: User class threw exception: org.apache.spark.sql.AnalysisException: Table or view not found: `DB_X`.`table_Y`
Spark session:
SparkSession
.builder()
.appName(appName)
.config("spark.sql.warehouse.dir", "/apps/hive/warehouse")
.enableHiveSupport()
.getOrCreate();
Hive warehouse directory in hive-site.xml: /apps/hive/warehouse/
hadoop fs -ls /apps/hive/warehouse/
drwxrwxrwx - root hadoop 0 2018-09-03 11:22 /apps/hive/warehouse/DB_X.db
hadoop fs -ls /apps/hive/warehouse/DB_X.db
(no output)
The error is thrown here:
spark
.read()
.table("DB_X.table_Y");
In Java:
spark.sql("show databases").show()
default
In spark-shell (interactive):
spark.sql("show databases").show()
default
DB_X
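The mismatch above (the Java application sees only default, while the interactive shell also sees DB_X) usually means the application session is not connected to the Hive metastore at all and has fallen back to Spark's built-in in-memory catalog. A quick, hedged way to check from either session (plain Spark SQL, no assumptions beyond standard commands):

-- Shows which catalog implementation this session uses:
-- "hive" means the Hive metastore is wired up; "in-memory" means it is not.
SET spark.sql.catalogImplementation;

-- Shows which database the session is currently in.
SELECT current_database();

If the application session reports in-memory, then hive-site.xml is not reaching the driver for that deployment mode (e.g. yarn-cluster), which would match the symptoms described.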
show create table table_Y:
CREATE EXTERNAL TABLE `table_Y`(
...
PARTITIONED BY (
`partition` string COMMENT '')
...
location '/data/kafka-connect/topics/table_Y'
Hadoop files:
hadoop fs -ls /data/kafka-connect/topics/table_Y
drwxr-xr-x - kafka hdfs 0 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0
drwxr-xr-x - kafka hdfs 0 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=1
hadoop fs -ls /data/kafka-connect/topics/table_Y/partition=0
-rw-r--r-- 3 kafka hdfs 102388 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0/table_Y+0+0001823382+0001824381.avro
-rw-r--r-- 3 kafka hdfs 102147 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0/table_Y+0+0001824382+0001825381.avro
...
Everything works fine in spark-shell or the Hive shell.
hive-site.xml was copied from the Hive conf directory to spark2/conf.
Using HDP 2.6.4.0-91 with Spark 2.2.
Any help?
Answer 0: (score: 0)
Relocating the table using the HA (HDFS high-availability) nameservice name solved the problem.
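For context, a sketch of what "relocating with the HA name" can look like: if the table's LOCATION was registered with a concrete NameNode host, repointing it at the HDFS HA nameservice makes it resolvable from every client. The nameservice name nameservice1 below is a placeholder, not taken from the question; use the value of dfs.nameservices from your own hdfs-site.xml.

-- Hypothetical fix: repoint the external table at the HA nameservice
-- instead of a specific NameNode host ("nameservice1" is a placeholder).
ALTER TABLE DB_X.table_Y
SET LOCATION 'hdfs://nameservice1/data/kafka-connect/topics/table_Y';

-- Re-register the partitions under the (re)located path.
MSCK REPAIR TABLE DB_X.table_Y;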