SQL subquery: subquery has too many columns

Date: 2016-02-17 00:09:03

Tags: sql postgresql

I am trying to run a query containing two subqueries. I want to create a column called "Delta Date" by subtracting minDate (from the second query) from Date1. Please help; I keep getting the error "subquery has too many columns."

SELECT Date1,                   #first query
     (Date1 - minDate) as Delta Date
    UNIQUE_ID 
FROM    panel
    WHERE (lower(criteria) LIKE lower('%criteria1%'))
    AND UNIQUE_ID IN (

SELECT  min(Date1) as minDate,          #second query
    UNIQUE_ID 
FROM    panel
    WHERE (lower(criteria) LIKE lower('%criteria2%'))
    AND Amount < 10000
    AND UNIQUE_ID IN ( SELECT   UNIQUE_ID        #third query
                           FROM     panel
                           WHERE    file_date > '9/30/2015'
/* AND additional logic to filter member purchases */
                           GROUP BY UNIQUE_ID
                           HAVING   count(AMOUNT) > 1 )

GROUP BY UNIQUE_ID )

2 Answers:

Answer 0 (score: 0)

The subquery you labeled "#second query" returns both minDate and UNIQUE_ID. That is the problem:


UNIQUE_ID IN (SELECT min(Date1) as minDate, UNIQUE_ID FROM ....

You need to move the #second query into the FROM clause of the first query.
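The answer's suggestion can be sketched with a minimal, self-contained example. This uses Python's sqlite3 as a stand-in for Postgres, with a simplified hypothetical `panel` schema (just `unique_id`, `date1`, `amount`); in Postgres, subtracting two `date` values yields days directly, whereas SQLite needs `julianday`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE panel (unique_id INTEGER, date1 TEXT, amount INTEGER)")
conn.executemany("INSERT INTO panel VALUES (?, ?, ?)",
                 [(1, "2015-10-05", 500), (1, "2015-11-01", 900),
                  (2, "2015-10-07", 20000)])

# The original mistake: a two-column subquery inside IN (...) is rejected.
err_msg = ""
try:
    conn.execute("""
        SELECT unique_id FROM panel
        WHERE unique_id IN (SELECT min(date1) AS minDate, unique_id
                            FROM panel GROUP BY unique_id)
    """)
except sqlite3.OperationalError as e:
    err_msg = str(e)
    print(err_msg)  # e.g. "sub-select returns 2 columns - expected 1"

# The fix: move the aggregate into a derived table in FROM and join on it,
# so each row can see its group's minimum date.
rows = conn.execute("""
    SELECT p.unique_id, p.date1,
           julianday(p.date1) - julianday(m.minDate) AS delta_days
    FROM panel p
    JOIN (SELECT unique_id, min(date1) AS minDate
          FROM panel GROUP BY unique_id) m
      ON m.unique_id = p.unique_id
    ORDER BY p.unique_id, p.date1
""").fetchall()
print(rows)
```

Joining the aggregated derived table back to `panel` gives every row access to its group's minimum, which is exactly what the "Delta Date" computation needs.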

Answer 1 (score: 0)

A subquery used with `IN` should return only one column in this context.

Your logic is quite hard to follow, and I am not sure that fixing the syntax will produce a query that does what you need.

I think you could use window functions to achieve a similar effect. I am not sure exactly what the logic should be, but it would look something like this:

UNIQUE_ID IN (
    SELECT UNIQUE_ID 
    FROM panel
    WHERE (lower(criteria) LIKE lower('%criteria1%')) AND
           UNIQUE_ID IN (SELECT UNIQUE_ID 
                         FROM  panel
                         WHERE (lower(criteria) LIKE lower('%criteria2%')) AND
                               Amount < 10000 AND
                               UNIQUE_ID IN (SELECT UNIQUE_ID 
                                             FROM  panel
                                             WHERE file_date > '9/30/2015'
    /* AND additional logic to filter member purchases */
                               GROUP BY UNIQUE_ID
                               HAVING   count(AMOUNT) > 1
                              )
                         GROUP BY UNIQUE_ID
                        )
    )
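The window-function idea the answer alludes to can be sketched as follows. This is again a stand-in sketch using SQLite via Python (window functions require SQLite 3.25+) with a trimmed-down hypothetical `panel` table; in Postgres the `julianday` conversion is unnecessary, since subtracting `date` values yields days:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE panel (unique_id INTEGER, date1 TEXT)")
conn.executemany("INSERT INTO panel VALUES (?, ?)",
                 [(1, "2015-10-05"), (1, "2015-11-01"), (2, "2015-10-07")])

# min(...) OVER (PARTITION BY ...) attaches each row's group minimum
# without collapsing rows, so the delta is computed in a single pass,
# with no IN (...) subquery at all.
rows = conn.execute("""
    SELECT unique_id, date1,
           julianday(date1)
             - min(julianday(date1)) OVER (PARTITION BY unique_id)
             AS delta_days
    FROM panel
    ORDER BY unique_id, date1
""").fetchall()
print(rows)
# [(1, '2015-10-05', 0.0), (1, '2015-11-01', 27.0), (2, '2015-10-07', 0.0)]
```

Unlike `GROUP BY`, the window version keeps every row of `panel`, which matches the question's goal of reporting each `Date1` alongside its delta from the group minimum.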