I have a table in my SQL Server database that records each time a user downloads an image from my application. My table TBL_Downloads has the following structure:
UserID | ImageID | DownloadDate
-------+---------+--------------------------
   292 |     782 | 02-01-2016 14:20:22.737
   292 |     783 | 02-01-2016 14:20:22.737
   292 |     784 | 02-02-2016 14:20:22.737
   292 |     785 | 02-04-2016 14:20:22.737
   292 |     786 | 02-05-2016 14:20:22.737
   292 |     787 | 02-06-2016 14:20:22.737
The table shows records for only one specific user, even though there are several; this is just to simplify the example.
What I want is a result set with the number of downloads per day for a specific user over the last 30 days, including zeros for the days with no downloads. I currently have the following query:
SELECT CAST(DownloadDate AS DATE) AS [Date], COUNT(*) AS Downloads
FROM TBL_Downloads
WHERE DownloadDate BETWEEN DATEADD(day, -30, GETDATE()) AND GETDATE()
  AND UserID = 292
GROUP BY CAST(DownloadDate AS DATE)
This returns the counts, but only for the days that have at least one download entry.
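For example, run against the sample data above (and assuming GETDATE() falls within 30 days of it), the query yields only five rows; 2016-02-03 is simply missing:

Date       | Downloads
-----------+----------
2016-02-01 | 2
2016-02-02 | 1
2016-02-04 | 1
2016-02-05 | 1
2016-02-06 | 1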
Any ideas how I can solve this?
Answer 0 (score: 2)
You can use a calendar or dates table for this kind of thing.
For only about 152 KB in memory, you can have 30 years of dates in a table:
/* dates table */
declare @fromdate date = '20000101';
declare @years int = 30;
/* 30 years, 19 used data pages ~152kb in memory, ~264kb on disk */
;with n as (select n from (values(0),(1),(2),(3),(4),(5),(6),(7),(8),(9)) t(n))
select top (datediff(day, @fromdate,dateadd(year,@years,@fromdate)))
[Date]=convert(date,dateadd(day,row_number() over(order by (select 1))-1,@fromdate))
into dbo.Dates
from n as deka cross join n as hecto cross join n as kilo
cross join n as tenK cross join n as hundredK
order by [Date];
create unique clustered index ix_dbo_Dates_date
on dbo.Dates([Date]);
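The five cross joins of the ten-row n set yield up to 10^5 = 100,000 candidate rows, and TOP (...) trims that down to the exact number of days requested. A quick sanity check (a sketch, assuming dbo.Dates was created as above):

-- Expect 10958 rows spanning 2000-01-01 through 2029-12-31
select count(*) as DayCount, min([Date]) as FirstDate, max([Date]) as LastDate
from dbo.Dates;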
If you don't want to take the step of actually creating a table, you can use the same technique inside a common table expression:
declare @fromdate date = dateadd(day , datediff(day , 0, getdate() )-30 , 0);
declare @thrudate date = dateadd(day , datediff(day , 0, getdate() ), 0);
;with n as (select n from (values(0),(1),(2),(3),(4),(5),(6),(7),(8),(9)) t(n))
, dates as (
select top (datediff(day, @fromdate, @thrudate)+1)
[Date]=convert(date,dateadd(day,row_number() over(order by (select 1))-1,@fromdate))
from n as deka cross join n as hecto cross join n as kilo
cross join n as tenK cross join n as hundredK
order by [Date]
)
select [Date]
from dates;
Use it like so:
select
d.Date
, count(t.DownloadDate) as DownloadCount
from dates d
left join TBL_Downloads t
on d.date = convert(date,t.DownloadDate)
and t.userid = 292
where d.date >= dateadd(day , datediff(day , 0, getdate() )-30 , 0)
and d.date <= dateadd(day , datediff(day , 0, getdate() ), 0)
group by d.date
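Note that the count is taken over t.DownloadDate rather than count(*): COUNT over a column ignores NULLs, so the days with no match from the left join correctly show 0. Likewise, the t.userid = 292 filter sits in the ON clause rather than the WHERE clause; moving it to WHERE would turn the left join into an inner join and the zero days would vanish again.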
Rextester demo: http://rextester.com/ISK37732 (dates changed to fall within the past 30 days)
Returns:
+------------+---------------+
| Date | DownloadCount |
+------------+---------------+
| 2017-02-27 | 0 |
| 2017-02-28 | 0 |
| 2017-03-01 | 2 |
| 2017-03-02 | 1 |
| 2017-03-03 | 0 |
| 2017-03-04 | 1 |
| 2017-03-05 | 1 |
| 2017-03-06 | 1 |
| 2017-03-07 | 0 |
| 2017-03-08 | 0 |
| 2017-03-09 | 0 |
| 2017-03-10 | 0 |
| 2017-03-11 | 0 |
| 2017-03-12 | 0 |
| 2017-03-13 | 0 |
| 2017-03-14 | 0 |
| 2017-03-15 | 0 |
| 2017-03-16 | 0 |
| 2017-03-17 | 0 |
| 2017-03-18 | 0 |
| 2017-03-19 | 0 |
| 2017-03-20 | 0 |
| 2017-03-21 | 0 |
| 2017-03-22 | 0 |
| 2017-03-23 | 0 |
| 2017-03-24 | 0 |
| 2017-03-25 | 0 |
| 2017-03-26 | 0 |
| 2017-03-27 | 0 |
| 2017-03-28 | 0 |
| 2017-03-29 | 0 |
+------------+---------------+
Answer 1 (score: 0)
You can use GROUP BY to find the per-day counts and a common table expression to generate the dates. For the final query below to execute, the #dates temp table must first be populated as shown after it.
--Final query: requires #dates to be populated as below
select d.cte_start_date as [date], coalesce(a.cnt, 0) as DownloadCount
from #dates d
left join
(
    select UserID, convert(date, DownloadDate) as dt, count(*) as cnt
    from #yourDownloads
    group by UserID, convert(date, DownloadDate)
) a
    on d.cte_start_date = a.dt
A recursive common table expression to populate the dates for the month:
--Populating dates
declare @start_date date = '02-01-2016'
declare @end_date date = eomonth('02-01-2016')
;WITH CTE AS
(
SELECT @start_date AS cte_start_date
UNION ALL
SELECT DATEADD(DAY, 1, cte_start_date)
FROM CTE
WHERE DATEADD(DAY, 1, cte_start_date) <= @end_date
)
select * into #dates from cte
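One caveat with the recursive approach: SQL Server caps recursion at 100 levels by default, which is fine for a single month but fails for longer ranges. A sketch of a hypothetical full-year version, raising the limit with a query hint:

-- A year of dates needs ~365 recursions, above the default limit of 100
declare @start_date date = '2016-01-01'
declare @end_date date = '2016-12-31'
;with cte as
(
    select @start_date as cte_start_date
    union all
    select dateadd(day, 1, cte_start_date)
    from cte
    where dateadd(day, 1, cte_start_date) <= @end_date
)
select * into #dates from cte
option (maxrecursion 366)  -- default is 100; 0 removes the cap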
Your input table:
create table #yourDownloads (UserId int, ImageId int, DownloadDate datetime)
insert into #yourDownloads
( UserID , ImageID , DownloadDate ) values
( 292 , 782 ,'02-01-2016 14:20:22.737')
,( 292 , 783 ,'02-01-2016 14:20:22.737')
,( 292 , 784 ,'02-02-2016 14:20:22.737')
,( 292 , 785 ,'02-04-2016 14:20:22.737')
,( 292 , 786 ,'02-05-2016 14:20:22.737')
,( 292 , 787 ,'02-06-2016 14:20:22.737')