HiveContext: cannot see temporary tables via JDBC client

Date: 2016-04-13 10:19:52

Tags: apache-spark hive pyspark-sql

  

In pyspark, I register a temporary table:

from pyspark.sql import HiveContext
sqlContext = HiveContext(sc)
df = sqlContext.sql("select * from test")
df.registerTempTable("testing")
sqlContext.sql("show tables").show()
+--------------------+-----------+
|           tableName|isTemporary|
+--------------------+-----------+
|             testing|       true|
|               check|      false|
+--------------------+-----------+

I can see the temporary table "testing" from pyspark.

I started the Spark Thrift server.

Then I launch a JDBC client (beeline) and connect to the Spark Thrift server:

$ ./bin/beeline
beeline> !connect jdbc:hive2://ip:10000
Connecting to jdbc:hive2://ip:10000
Enter username for jdbc:hive2://ip: 
Enter password for jdbc:hive2://ip:10000:
16/03/06 13:17:41 INFO jdbc.Utils: Supplied authorities: :10000
16/03/06 13:17:41 INFO jdbc.Utils: Resolved authority: :10000
16/03/06 13:17:41 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://ip:10000
Connected to: Spark SQL (version 1.5.2)
Driver: Spark Project Core (version 1.5.2)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://ip.> show tables;
+-------------+--------------+--+
|  tableName  | isTemporary  |
+-------------+--------------+--+
| check       | false        |
+-------------+--------------+--+
2 rows selected (0.842 seconds)
0: jdbc:hive2://ip.>

I cannot see the temporary table. Is there something I'm missing?

1 Answer:

Answer 0 (score: 2)

A temporary table lives only within the session that created it. Your beeline connection opens a new session on the Thrift server, so it cannot see the temporary table testing.
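A commonly used workaround in Spark 1.x, sketched here as an assumption rather than part of the original answer, is to start the Thrift server from inside the same application that registers the temporary table, so JDBC clients connect to that application's HiveContext. PySpark has no direct wrapper for this, so the Scala method `HiveThriftServer2.startWithContext` is reached through the JVM gateway; the table name `test` and app name are placeholders:

```python
from pyspark import SparkContext
from pyspark.sql import HiveContext

# Requires a running Spark 1.x deployment with Hive support;
# not runnable as a standalone script.
sc = SparkContext(appName="thrift-with-temp-tables")
sqlContext = HiveContext(sc)

# Register the temporary table in this application's context.
df = sqlContext.sql("select * from test")
df.registerTempTable("testing")

# Start the Thrift server inside this application, sharing its
# HiveContext, via the JVM gateway (sqlContext._ssql_ctx is the
# underlying JVM HiveContext in PySpark 1.x).
sc._jvm.org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 \
    .startWithContext(sqlContext._ssql_ctx)
```

With the server started this way, a beeline session connecting to port 10000 should see `testing` in `show tables`, because it shares the session state of the context that registered it.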