Select all dates between two dates and show the count of unique inventory IDs for each date

Time: 2015-08-03 09:47:07

Tags: oracle date

I am using an Oracle DB. I have two tables, Inventory and grounding_info. Each inventory can have multiple grounding info records, or none. The table structures are as follows.

**Inventory**
Inventory_id

**Grounding_info**
Info_id
Inventory_id
Grounding_date
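
In SQL terms, the two tables might look roughly like this (a sketch only; the column types and the foreign key are assumptions, not stated in the question):

create table inventory (
  inventory_id   number primary key
);

create table grounding_info (
  info_id        number primary key,
  inventory_id   number references inventory (inventory_id),  -- assumed relationship
  grounding_date date
);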

I want to get the inventory count for each date within a given date range. If no inventory was grounded on a date, that date should show a count of 0.

I tried the following query:

SELECT tble.dte,
       COUNT(groundinginfo.inventory_id) cmt
FROM   atl_grounding_info groundinginfo
       RIGHT JOIN (SELECT TO_DATE('2015/01/01 12:00:00 A.M.',
                                  'YYYY/MM/DD hh:mi:ss A.M.') - 1 + ROWNUM AS dte
                   FROM   all_objects groundinginfo
                   WHERE  TO_DATE('2015/01/01 12:00:00 A.M.',
                                  'YYYY/MM/DD hh:mi:ss A.M.') - 1 + ROWNUM <=
                          TO_DATE('2015/04/01 11:59:59 P.M.',
                                  'YYYY/MM/DD hh:mi:ss P.M.')) tble
               ON groundinginfo.date_turned_in = tble.dte
GROUP  BY tble.dte

But the count always shows 0.

3 answers:

Answer 0: (score: 1)

If you are sure you have some values, try it with TRUNC:

SELECT tble.dte,
       COUNT(groundinginfo.inventory_id) cmt
FROM   atl_grounding_info groundinginfo
       RIGHT JOIN (SELECT TO_DATE('2015/01/01 12:00:00 A.M.',
                                  'YYYY/MM/DD hh:mi:ss A.M.') - 1 + ROWNUM AS dte
                   FROM   all_objects groundinginfo
                   WHERE  TO_DATE('2015/01/01 12:00:00 A.M.',
                                  'YYYY/MM/DD hh:mi:ss A.M.') - 1 + ROWNUM <=
                          TO_DATE('2015/04/01 11:59:59 P.M.',
                                  'YYYY/MM/DD hh:mi:ss P.M.')) tble
               ON TRUNC(groundinginfo.date_turned_in) = TRUNC(tble.dte)
GROUP  BY tble.dte;
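
For reference, TRUNC with no format argument sets the time portion of an Oracle DATE to midnight, so both sides of the join are compared at day granularity; a minimal illustration:

SELECT TRUNC(TO_DATE('2015/08/03 09:47:07', 'YYYY/MM/DD HH24:MI:SS')) day_only
FROM   dual;
-- returns 2015-08-03 00:00:00 (the time of day is stripped)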

Answer 1: (score: 1)

with dates as (
  select trunc(sysdate - 100) /* begin date */ + level d
  from   dual
  connect by level + trunc(sysdate - 100) < sysdate /* end date */)
select d, count(object_id)
from   dates
       left join user_objects b on b.last_ddl_time >= d and b.last_ddl_time < d + 1
group by d;
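
Applied to the tables in the question, the same pattern might look like the sketch below (the fixed date range and the COUNT(DISTINCT ...) are assumptions drawn from the question, not part of the original answer):

with dates as (
  select date '2015-01-01' - 1 + level d     -- first day of the range
  from   dual
  connect by level <= date '2015-04-01'      -- last day of the range
                      - date '2015-01-01' + 1)
select d,
       count(distinct gi.inventory_id) cnt   -- unique inventory IDs per day
from   dates
       left join grounding_info gi
              on gi.grounding_date >= d
             and gi.grounding_date < d + 1   -- range join; no function on the column
group by d
order by d;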

Answer 2: (score: 0)

Here's an alternative that might help speed things up, although if there's a lot of data in your table I'd expect it to take a while anyway.

with grounding_info as (select 1 info_id, 1 inventory_id, to_timestamp('01/08/2015 12:02:38.529343', 'dd/mm/yyyy hh24:mi:ss.ff6') grounding_date from dual union all
                        select 2 info_id, 2 inventory_id, to_timestamp('02/08/2015 15:12:15.123456', 'dd/mm/yyyy hh24:mi:ss.ff6') grounding_date from dual union all
                        select 3 info_id, 3 inventory_id, to_timestamp('04/08/2015 09:58:46.654321', 'dd/mm/yyyy hh24:mi:ss.ff6') grounding_date from dual union all
                        select 4 info_id, 4 inventory_id, to_timestamp('05/08/2015 11:43:29.394502', 'dd/mm/yyyy hh24:mi:ss.ff6') grounding_date from dual union all
                        select 5 info_id, 5 inventory_id, to_timestamp('05/08/2015 23:25:43.394502', 'dd/mm/yyyy hh24:mi:ss.ff6') grounding_date from dual),
              dates as (select dt start_of_day, dt + 1 start_of_next_day
                        from   (select to_date('31/07/2015', 'dd/mm/yyyy') -- first day; ideally these would be parameterised!
                                         - 1 + level 
                                         as dt
                                from   dual
                                connect by level <= to_date('06/08/2015', 'dd/mm/yyyy') -- final day
                                                      - to_date('31/07/2015', 'dd/mm/yyyy') -- first day
                                                      + 1 -- required to include the first day in the list of dates
                                ))
select dts.start_of_day dt,
       count(gi.info_id) 
from   dates dts
       left outer join grounding_info gi on (gi.grounding_date >= dts.start_of_day and gi.grounding_date < dts.start_of_next_day)
group by dts.start_of_day
order by dts.start_of_day;

DT         COUNT(GI.INFO_ID)
---------- -----------------
31/07/2015                 0
01/08/2015                 1
02/08/2015                 1
03/08/2015                 0
04/08/2015                 1
05/08/2015                 2
06/08/2015                 0
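
One caveat on both sample queries: the title asks for a count of unique inventory IDs, while COUNT(gi.info_id) counts grounding records, so an inventory grounded twice on the same day would be counted twice. If that matters, a variant of the final SELECT (assuming the question's column names):

select dts.start_of_day dt,
       count(distinct gi.inventory_id) cnt   -- at most one per inventory per day
from   dates dts
       left outer join grounding_info gi on (gi.grounding_date >= dts.start_of_day
                                         and gi.grounding_date < dts.start_of_next_day)
group by dts.start_of_day
order by dts.start_of_day;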