Incremental count

Date: 2018-06-20 11:10:06

Tags: sql oracle lag window-functions lead

I have a table containing a list of customer numbers and order dates, and I want to add a count against each customer number, restarting from 1 each time the customer number changes. I have sorted the table by customer and then by date order, and need to add an order count column.

CASE WHEN 'Customer Number' on this row = 'Customer Number' on previous row
  THEN Count = Count on previous row + 1
  ELSE Count = 1

What is the best way to solve this?

The table is sorted by Customer, and by Date within each Customer:

Customer    Date      Count
0001        01/05/18  1 
0001        02/05/18  2
0001        03/05/18  3
0002        03/05/18  1  <- back to one here as Customer changed
0002        04/05/18  2
0003        05/05/18  1  <- back to one again

I have just tried COUNT(*) OVER (PARTITION BY Customer) AS COUNT, but for some reason it does not seem to restart from 1 when the customer changes.

2 Answers:

Answer 0 (score: 1)

It's hard to tell exactly what you want, but "add a count against each customer number, restarting from 1 each time the customer number changes" sounds like you simply want:

count(*) over (partition by customer_number) 

Or perhaps it should be a count "up to" that row's date:

count(*) over (partition by customer_number order by order_date) 
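The distinction matters: without the ORDER BY, the window spans the whole partition, so every row gets the customer's total; with the ORDER BY, Oracle defaults to a running window (RANGE UNBOUNDED PRECEDING to CURRENT ROW), which is what produces 1, 2, 3, ... per customer. A minimal side-by-side sketch (the table and column names are assumed from the question, not given in the original):

```sql
select customer_number,
       order_date,
       -- whole-partition window: every row shows the customer's total
       count(*) over (partition by customer_number) as total_per_customer,
       -- running window: restarts at 1 for each new customer
       count(*) over (partition by customer_number order by order_date) as running_count
from orders
order by customer_number, order_date;
```

This is why the asker's COUNT(*) OVER (PARTITION BY Customer) attempt showed the same value on every row instead of restarting from 1.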

Answer 1 (score: 0)

It sounds like you just want an analytic row_number() call:

select customer_number,
  order_date,
  row_number() over (partition by customer_number order by order_date) as num
from your_table
order by customer_number,
  order_date

You could also use an analytic count, as suggested by @horse_with_no_name:

  count(*) over (partition by customer_number order by order_date) as num

A quick demo showing both, with the sample data in a CTE:

with your_table (customer_number, order_date) as (
            select '0001', date '2018-05-01' from dual
  union all select '0001', date '2018-05-03' from dual
  union all select '0001', date '2018-05-02' from dual
  union all select '0002', date '2018-05-03' from dual
  union all select '0002', date '2018-05-04' from dual
  union all select '0003', date '2018-05-05' from dual
)
select customer_number,
  order_date,
  row_number() over (partition by customer_number order by order_date) as num1,
  count(*) over (partition by customer_number order by order_date) as num2
from your_table
order by customer_number,
  order_date
/

CUST ORDER_DATE       NUM1       NUM2
---- ---------- ---------- ----------
0001 2018-05-01          1          1
0001 2018-05-02          2          2
0001 2018-05-03          3          3
0002 2018-05-03          1          1
0002 2018-05-04          2          2
0003 2018-05-05          1          1
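One caveat, not raised in the original answers but standard Oracle window-function behavior: with an ORDER BY, count(*) defaults to a RANGE frame, so two orders on the same date for the same customer would both receive the same (higher) count, whereas row_number() always assigns distinct 1, 2, 3, .... A sketch illustrating the difference, reusing the CTE pattern above with a duplicated date added for this purpose:

```sql
with your_table (customer_number, order_date) as (
            select '0001', date '2018-05-01' from dual
  union all select '0001', date '2018-05-01' from dual  -- deliberate duplicate date
  union all select '0001', date '2018-05-02' from dual
)
select customer_number,
       order_date,
       row_number() over (partition by customer_number order by order_date) as num1, -- 1, 2, 3
       count(*)     over (partition by customer_number order by order_date) as num2  -- 2, 2, 3 (ties share a count)
from your_table
order by customer_number, order_date;
```

If duplicate order dates are possible and a strictly sequential number is required, row_number() is the safer choice; note also that the order of ties under row_number() is then arbitrary unless a tie-breaking column is added to the ORDER BY.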