SQL subquery fails when loading a PostgreSQL table in Spark 2

Date: 2018-09-27 18:34:45

Tags: postgresql apache-spark jdbc

I am running into a very annoying PSQL problem when trying to load part of a PostgreSQL table through a subquery.

The query is:

SELECT 
    N1,
    N2, 
    N3,
    N4
FROM CORR 
WHERE CORR_N5 >= (now() - interval '18 year') 
AND CORR_N5 <= (now() - interval '18 year' + interval '1 month')

This works when run directly in PgAdmin. However, when I run it from a Spark 2 job, I get the following error message:

org.postgresql.util.PSQLException: ERROR: subquery in FROM must have an alias
  Hint: For example, FROM (SELECT ...) [AS] foo.

The same problem occurs even when I put an alias after all the clauses.

Any suggestions?

Thanks in advance

1 answer:

Answer 0 (score: 0)

Melvin, have a look at the following link:

https://pganalyze.com/docs/log-insights/app-errors/U115

subquery in FROM must have an alias

SELECT * FROM (
    SELECT N1, N2, N3, N4 
    FROM CORR WHERE CORR_N5 >= (now() - interval '18 year') 
    AND CORR_N5 <= (now() - interval '18 year' + interval '1 month')
) AS input
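The reason the alias is needed: Spark's JDBC reader embeds the `dbtable` option verbatim into a query of the form `SELECT ... FROM <dbtable>`, so a raw query must be wrapped in parentheses and given an alias to be valid PostgreSQL. A minimal sketch of how this might look from PySpark (the connection URL, credentials, and the `as_dbtable` helper are illustrative placeholders, not part of the original question):

```python
# Sketch: wrap a raw SQL query so it is legal as a JDBC "dbtable" value.
# Spark substitutes this string directly into "SELECT ... FROM <dbtable>",
# and PostgreSQL rejects an unaliased subquery in FROM.

query = """
SELECT N1, N2, N3, N4
FROM CORR
WHERE CORR_N5 >= (now() - interval '18 year')
  AND CORR_N5 <= (now() - interval '18 year' + interval '1 month')
"""

def as_dbtable(sql, alias="subq"):
    """Wrap a raw query in parentheses with an alias (helper name is hypothetical)."""
    return "({}) AS {}".format(sql.strip(), alias)

# Usage with Spark's JDBC source (URL/user/password are placeholders):
# df = (spark.read
#       .format("jdbc")
#       .option("url", "jdbc:postgresql://host:5432/mydb")
#       .option("dbtable", as_dbtable(query))
#       .option("user", "me")
#       .option("password", "secret")
#       .load())
```

With this wrapping, the string handed to Spark is exactly the aliased form shown in the answer above, and the `subquery in FROM must have an alias` error goes away.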