Unable to query a complex SQL statement from a Hive table with pyspark

Date: 2019-07-08 15:13:15

Tags: apache-spark hive pyspark apache-spark-sql

Hi, I am trying to query a Hive table from a Spark context.

My code:

from pyspark.sql import HiveContext

hive_context = HiveContext(sc)
bank = hive_context.table('select * from db.table_name')
bank.show()

A simple query like this works fine without any errors. But when I try the following query:

query = """with table1 as (
    select distinct a, b
    from db_first.table_first
    order by b
)
--select * from table1 order by b
, c as (
    select *
    from db_first.table_two
)
--select * from c
, d as (
    select *
    from c
    where upper(e) = 'Y'
)
--select * from d
, f as (
    select table1.b
        ,cast(regexp_extract(g, '(\\d+)-(A|B)-(\\d+)(.*)', 1) as int) aid1
        ,regexp_extract(g, '(\\d+)-(A|B)-(\\d+)(.*)', 2) aid2
        ,cast(regexp_extract(g, '(\\d+)-(A|B)-(\\d+)(.*)', 3) as int) aid3
        ,from_unixtime(cast(substr(lastdbupdatedts, 1, 10) as int), 'yyyy-MM-dd HH:mm:ss') lastupdts
        ,d.*
    from d
    left outer join table1
        on d.hiba = table1.a
)
select * from f order by b, aid1, aid2, aid3 limit 100"""
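As an aside on what the query's aid1/aid2/aid3 columns extract: Hive's regexp_extract(col, pattern, idx) returns the idx-th capture group of the first match, or an empty string when nothing matches. A minimal Python sketch of that behavior using the standard re module (the sample value "123-A-456_rev7" is a made-up illustration, not from the source data):

```python
import re

def regexp_extract(s, pattern, idx):
    # Mimics Hive's regexp_extract: idx-th group of the first match,
    # empty string if the pattern does not match anywhere in s.
    m = re.search(pattern, s)
    return m.group(idx) if m else ""

# Hypothetical id value shaped like the pattern used in the query above.
g = "123-A-456_rev7"
pattern = r"(\d+)-(A|B)-(\d+)(.*)"

aid1 = int(regexp_extract(g, pattern, 1))  # 123
aid2 = regexp_extract(g, pattern, 2)       # 'A'
aid3 = int(regexp_extract(g, pattern, 3))  # 456
```

Note that the casts to int in the query will fail (produce NULL in Hive) for rows where g does not match the pattern, since the extract then yields an empty string.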

I get the following error. Please help:

ParseExceptionTraceback (most recent call last)
<ipython-input-27-cedb6fad210d> in <module>()
      3 hive_context = HiveContext(sc)
      4 #bank = hive_context.table("bdalab.test_prodapt_inv")
----> 5 bank = hive_context.table(first)

ParseException: u"\nmismatched input '*' expecting <EOF>(line 1, pos 7)\n\n== SQL ==\nselect *

1 Answer:

Answer 0 (score: 1)

If you are running an SQL query, you need to use the .sql method instead of the .table method.

1. With the .table method, you provide only a table name:

>>> hive_context.table("<db_name>.<table_name>").show()

2. With the .sql method, pass your full query string (including the with CTE):

>>> first ="with cte..."
>>> hive_context.sql(first).show()
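The same distinction exists in any SQL front end: a table-lookup call takes a name, while a query string, CTEs included, must go through an SQL executor. A small sketch of the idea outside Spark, using the standard sqlite3 module in place of hive_context (the table and data here are made up for illustration):

```python
import sqlite3

# In-memory database standing in for the Hive metastore/warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE table_first (a INTEGER, b TEXT);
    INSERT INTO table_first VALUES (1, 'x'), (1, 'x'), (2, 'y');
""")

# A query string with a CTE, analogous to hive_context.sql(query).
query = """
WITH table1 AS (
    SELECT DISTINCT a, b FROM table_first
)
SELECT * FROM table1 ORDER BY b
"""
rows = conn.execute(query).fetchall()
print(rows)  # [(1, 'x'), (2, 'y')]
```

Passing the query string where a bare table name is expected is exactly what produces the "mismatched input '*'" ParseException in the question: the parser tries to read "select * from ..." as an identifier.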