How to use connect by dynamically

Time: 2017-12-05 22:17:32

Tags: sql oracle subquery connect-by sql-subselect

How do I achieve the following?

I have this query:

select sr.source_rel_id,
       sr.source_rel_start,
       sr.source_rel_end,
       sr.source_rel_end - sr.source_rel_start + 1 as daycount
 from (SELECT RELEASES.REL_ID as source_rel_id,
              RELEASES.REL_START_DATE as source_rel_start,
              RELEASES.REL_END_DATE as source_rel_end
         FROM RELEASES) sr

which gives this result:

(screenshot: the current query result)

I am trying to create an additional column (see the attached screenshot) that is populated with incrementing dates between the given start and end interval.

This is what I am trying to accomplish:

(screenshot: the desired result)

The numbers on the right show how many records should be generated for each ID.

Sorry for my bad English; I hope this is understandable.

2 Answers:

Answer 0 (score: 2)

This is basically a row-generating connect by query. For example, to generate rows from 1 to 10, use:

select level as x
from dual connect by level <= 10;

|  X |
|----|
|  1 |
|  2 |
|  3 |
|  4 |
|  5 |
|  6 |
|  7 |
|  8 |
|  9 |
| 10 |

To generate rows from 5 to 10, use:

select 5 + level - 1 as x
from dual connect by level <= 10 - 5 + 1;

|  X |
|----|
|  5 |
|  6 |
|  7 |
|  8 |
|  9 |
| 10 |

To generate dates from 2017-01-02 to 2017-01-05, use:

select date '2017-01-02' + level - 1 as x
from dual connect by level <= date '2017-01-05' - date '2017-01-02' + 1;

|                    X |
|----------------------|
| 2017-01-02T00:00:00Z |
| 2017-01-03T00:00:00Z |
| 2017-01-04T00:00:00Z |
| 2017-01-05T00:00:00Z |

If you are using Oracle 12c, you can use LATERAL or CROSS APPLY to run the latter generator query for many start and end values coming from a source subquery or table. Consider this example:

create table probe(
   source_rel_id int,
   source_rel_start date,
   source_rel_end date
);

insert into probe values( 1, date '2017-01-02', date '2017-01-05' );
insert into probe values( 2, date '2017-03-01', date '2017-03-15' );
insert into probe values( 3, date '2017-05-05', date '2017-05-30' );
commit;
select * from probe p
cross apply (
    select p.source_rel_start + level - 1 as my_date
    from dual connect by level <= p.source_rel_end - p.source_rel_start + 1
)

SOURCE_REL_ID SOURCE_REL_START SOURCE_REL_END   MY_DATE         
------------- ---------------- ---------------- ----------------
            1 2017/01/02 00:00 2017/01/05 00:00 2017/01/02 00:00
            1 2017/01/02 00:00 2017/01/05 00:00 2017/01/03 00:00
            1 2017/01/02 00:00 2017/01/05 00:00 2017/01/04 00:00
            1 2017/01/02 00:00 2017/01/05 00:00 2017/01/05 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/01 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/02 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/03 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/04 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/05 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/06 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/07 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/08 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/09 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/10 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/11 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/12 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/13 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/14 00:00
            2 2017/03/01 00:00 2017/03/15 00:00 2017/03/15 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/05 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/06 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/07 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/08 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/09 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/10 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/11 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/12 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/13 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/14 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/15 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/16 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/17 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/18 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/19 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/20 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/21 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/22 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/23 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/24 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/25 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/26 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/27 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/28 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/29 00:00
            3 2017/05/05 00:00 2017/05/30 00:00 2017/05/30 00:00
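
The example above uses CROSS APPLY; for reference, here is a minimal sketch of the LATERAL alternative mentioned above against the same probe table (this variant is not shown in the original answer, so treat it as an illustration):

select p.source_rel_id, p.source_rel_start, p.source_rel_end, d.my_date
from probe p,
     lateral (
       -- same generator as above, correlated to the outer row
       select p.source_rel_start + level - 1 as my_date
       from dual
       connect by level <= p.source_rel_end - p.source_rel_start + 1
     ) d;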

Edit

  If I don't have 12c, do I have any other option? I should check the version, but I recall it is 11.

Just generate a series of numbers from 1 to N, where N has to be greater than the maximum number of dates you want to generate for a single record, like this:

  SELECT level as xyz FROM dual
  CONNECT BY LEVEL <= 10000

Then join the result of the above query to the table in this way -> see this demo

SELECT source_rel_id, source_rel_start, source_rel_end,
       source_rel_start + xyz - 1 As days
FROM (
  SELECT level as xyz FROM dual
  CONNECT BY LEVEL <= 10000
) x
JOIN probe p
ON xyz <= source_rel_end - source_rel_start + 1
ORDER BY source_rel_id, days

| SOURCE_REL_ID |     SOURCE_REL_START |       SOURCE_REL_END |                 DAYS |
|---------------|----------------------|----------------------|----------------------|
|             1 | 2017-01-02T00:00:00Z | 2017-01-05T00:00:00Z | 2017-01-02T00:00:00Z |
|             1 | 2017-01-02T00:00:00Z | 2017-01-05T00:00:00Z | 2017-01-03T00:00:00Z |
|             1 | 2017-01-02T00:00:00Z | 2017-01-05T00:00:00Z | 2017-01-04T00:00:00Z |
|             1 | 2017-01-02T00:00:00Z | 2017-01-05T00:00:00Z | 2017-01-05T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-01T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-02T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-03T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-04T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-05T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-06T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-07T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-08T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-09T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-10T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-11T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-12T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-13T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-14T00:00:00Z |
|             2 | 2017-03-01T00:00:00Z | 2017-03-15T00:00:00Z | 2017-03-15T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-05T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-06T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-07T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-08T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-09T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-10T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-11T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-12T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-13T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-14T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-15T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-16T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-17T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-18T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-19T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-20T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-21T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-22T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-23T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-24T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-25T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-26T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-27T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-28T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-29T00:00:00Z |
|             3 | 2017-05-05T00:00:00Z | 2017-05-30T00:00:00Z | 2017-05-30T00:00:00Z |

I have used 10000 here, which should be enough: 10000 days is more than 27 years, and I don't think you need to generate 27 years of dates for each record. If that is still too small, in practice it can be a number between 1 million and 10 million; since the rows are generated in memory, at some point (around 1 to 10 million) you will run into an out-of-memory error. You can also compute this limit exactly with a subquery:

SELECT level as xyz FROM dual
CONNECT BY LEVEL <= (
     SELECT max(  source_rel_end - source_rel_start ) + 1
     FROM probe
)
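
Putting the two pieces together, the join query from above with this exact limit plugged in would look roughly like this (a sketch combining the two queries shown in this answer):

SELECT source_rel_id, source_rel_start, source_rel_end,
       source_rel_start + xyz - 1 As days
FROM (
  -- generate exactly as many numbers as the widest interval needs
  SELECT level as xyz FROM dual
  CONNECT BY LEVEL <= (
       SELECT max( source_rel_end - source_rel_start ) + 1
       FROM probe
  )
) x
JOIN probe p
ON xyz <= source_rel_end - source_rel_start + 1
ORDER BY source_rel_id, days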

Answer 1 (score: 0)

If you are still on Oracle 11g (as suggested in the comments), you can do this with a connect by hierarchical query, but since you are getting data from multiple source rows you need to include a reference to a non-deterministic function (I have used dbms_random.value, but others are available...) to prevent it looping:

select rel_id as source_rel_id,
  rel_start_date + level - 1 as days,
  rel_end_date - rel_start_date + 1 as daycount
from releases
connect by rel_id  = prior rel_id
and prior dbms_random.value is not null
and level <= rel_end_date - rel_start_date + 1;

Using just the first three rows of your sample data, that gives 92 rows in the result set:

SOURCE_REL_ID DAYS         DAYCOUNT
------------- ---------- ----------
           68 2016-03-01         31
           68 2016-03-02         31
           68 2016-03-03         31
...
           68 2016-03-30         31
           68 2016-03-31         31
           71 2016-06-01         30
           71 2016-06-02         30
...
           71 2016-06-29         30
           71 2016-06-30         30
           73 2016-08-01         31
           73 2016-08-02         31
...
           73 2016-08-30         31
           73 2016-08-31         31
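
As a quick sanity check on that count, the three sample ranges cover 31 + 30 + 31 = 92 days; assuming the same releases sample data, the total can be verified directly with something like:

-- 31 (March) + 30 (June) + 31 (August) days = 92 rows expected
select sum(rel_end_date - rel_start_date + 1) as expected_rows
from releases
where rel_id in (68, 71, 73);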

If you are on 11gR2 (or later), you can also use recursive subquery factoring, a.k.a. a recursive CTE:

with rcte (source_rel_id, rel_end_date, days, daycount) as (
  select rel_id, rel_end_date, rel_start_date, rel_end_date - rel_start_date + 1
  from releases
  union all
  select source_rel_id, rel_end_date, days + 1, daycount
  from rcte
  where days < rel_end_date
)
select source_rel_id, days, daycount
from rcte
order by source_rel_id, days;

which gets the same result - except explicitly ordered, which probably isn't a bad thing.

The anchor branch of the CTE gets the data from your base table and adds two extra columns: daycount, calculated the same way as before, and days, which for the first set of rows is the same as the start date. rel_end_date needs to be available to the recursive branch, so it is included, but rel_start_date can be skipped.

The recursive branch then adds one day to days until it reaches the end date. If the days value in the previously generated row is already rel_end_date, the condition is not met, so it stops for that source row.

Finally, you query only the columns you want to see from the CTE - in this case just excluding rel_end_date, which is passed along to provide the stop condition but which you don't actually want to see.
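
For anyone who wants to try the recursive CTE against the probe sample table from the first answer instead of releases, a sketch with the column names adjusted (otherwise the same logic) would be:

with rcte (source_rel_id, source_rel_end, days, daycount) as (
  -- anchor branch: one row per interval, days starts at the start date
  select source_rel_id, source_rel_end, source_rel_start,
         source_rel_end - source_rel_start + 1
  from probe
  union all
  -- recursive branch: add one day until the end date is reached
  select source_rel_id, source_rel_end, days + 1, daycount
  from rcte
  where days < source_rel_end
)
select source_rel_id, days, daycount
from rcte
order by source_rel_id, days;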