SCENARIO
I need to select records from test_userData based on a one-to-one match against test_userCheck on either the customer column or the account_info column. The code below creates mock versions of the tables and fills them with sample data for the purposes of this question. Given this data, the query looks for any record where test_userData.customer = 'Guerrero, Unity' or test_userData.account_info = 'XXXXXXXXXXXXXXXX0821', and it should return three rows (confirmation_id = 6836985, 5502798, and 3046441).
PROBLEM
As it stands, the query returns exactly what I need... however, my real userData table has almost 2 million records, and the userCheck table has roughly 10,000. The query takes about 7 seconds, which feels far too long. I'm also concerned because the userData table will start to grow quickly (tens of thousands of unique records per day), and I can see my current approach becoming unmanageable.
QUESTION
Any ideas on how to optimize this so it scales to millions of records? The data resides on a shared SQL Server 2008 instance where I have limited permissions.
--setup temporary testing tables
IF EXISTS
(
SELECT * FROM dbo.sysobjects
WHERE id = object_id(N'[dbo].[test_userData]')
AND OBJECTPROPERTY(id, N'IsUserTable') = 1
)
DROP TABLE [dbo].[test_userData]
GO
IF EXISTS
(
SELECT * FROM dbo.sysobjects
WHERE id = object_id(N'[dbo].[test_userCheck]')
AND OBJECTPROPERTY(id, N'IsUserTable') = 1
)
DROP TABLE [dbo].[test_userCheck]
GO
CREATE TABLE [dbo].[test_userData](
[id] [int] IDENTITY(1,1) NOT NULL,
[merchant_id] [int] NOT NULL,
[sales_date] [datetime] NOT NULL,
[confirmation_id] [int] NOT NULL,
[customer] [nvarchar](max) NOT NULL,
[total] [smallmoney] NOT NULL,
[account_info] [nvarchar](max) NOT NULL,
[email_address] [nvarchar](max) NOT NULL,
CONSTRAINT [PK_test_userData] PRIMARY KEY CLUSTERED
(
[id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[test_userCheck](
[confirmation_id] [int] NOT NULL,
[customer] [nvarchar](max) NOT NULL,
[total] [smallmoney] NOT NULL,
[account_info] [nvarchar](max) NOT NULL,
CONSTRAINT [PK_test_userCheck] PRIMARY KEY CLUSTERED
(
[confirmation_id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
--insert some random user transactions
INSERT INTO [dbo].[test_userData] (merchant_id,sales_date,confirmation_id,customer,total,account_info,email_address) VALUES
('99','03/25/2010','3361424','Soto, Ahmed','936','XXXXXXXXXXXXXXXX8744','Donec.egestas@NullainterdumCurabitur.ca'),
('17','09/12/2010','6710165','Holcomb, Eden','1022','XXXXXXXXXXXXXXXX6367','Curabitur@dolortempus.org'),
('32','05/04/2010','4489509','Foster, Nasim','1463','XXXXXXXXXXXXXXXX7115','augue.eu.tellus@ullamcorperviverraMaecenas.ca'),
('95','01/02/2011','5384061','Browning, Owen','523','XXXXXXXXXXXXXXXX0576','sed.dictum.eleifend@accumsaninterdum.edu'),
('91','08/21/2010','6075234','Dawson, McKenzie','141','XXXXXXXXXXXXXXXX3580','dolor.sit.amet@etmagnis.org'),
('63','01/29/2010','1055619','Mathews, Keefe','1110','XXXXXXXXXXXXXXXX2682','ligula@Sednuncest.edu'),
('27','10/20/2010','1819662','Clarke, Briar','1474','XXXXXXXXXXXXXXXX7481','Donec.non.justo@malesuada.org'),
('82','03/05/2010','3184936','Holman, Dana','560','XXXXXXXXXXXXXXXX7080','Aenean.eget.magna@accumsan.edu'),
('24','06/11/2010','1007427','Kirk, Desiree','206','XXXXXXXXXXXXXXXX3681','parturient@at.com'),
('49','06/17/2010','6137066','Foley, Sopoline','1831','XXXXXXXXXXXXXXXX1718','ac.urna.Ut@pellentesqueafacilisis.org'),
('22','05/08/2010','3545367','Howell, Uriel','638','XXXXXXXXXXXXXXXX1945','ad.litora@arcuvelquam.ca'),
('5','10/25/2010','6836985','Little, Caryn','743','XXXXXXXXXXXXXXXX0821','Suspendisse.aliquet@auctor.org'),
('91','06/16/2010','6852582','Buckner, Chiquita','99','XXXXXXXXXXXXXXXX1533','tellus.sem@semvitaealiquam.edu'),
('63','06/12/2010','7930230','Nolan, Wyoming','1192','XXXXXXXXXXXXXXXX1291','Sed@diam.org'),
('32','02/01/2010','8407102','Cummings, Deacon','1315','XXXXXXXXXXXXXXXX4375','a.odio.semper@massaSuspendisseeleifend.ca'),
('75','06/29/2010','5502798','Guerrero, Unity','858','XXXXXXXXXXXXXXXX8000','eget@lectus.edu'),
('50','09/13/2010','8312525','Russo, Yvette','1680','XXXXXXXXXXXXXXXX2046','In.mi@eu.com'),
('11','04/13/2010','6204132','Small, Calista','426','XXXXXXXXXXXXXXXX0269','lacus@Cumsociisnatoque.org'),
('16','01/01/2011','7522507','Mosley, Thor','1459','XXXXXXXXXXXXXXXX8451','netus.et@Pellentesqueutipsum.com'),
('5','01/27/2010','1472120','Case, Kiona','1419','XXXXXXXXXXXXXXXX7097','Duis@duilectusrutrum.edu'),
('70','02/17/2010','1095935','Snyder, Tanner','1655','XXXXXXXXXXXXXXXX8556','metus.sit.amet@inconsequatenim.edu'),
('63','11/10/2010','3046441','Guerrero, Unity','629','XXXXXXXXXXXXXXXX0807','nonummy.ac.feugiat@Phasellusdapibus.org'),
('22','08/19/2010','5435100','Turner, Patrick','1133','XXXXXXXXXXXXXXXX6734','pede@Duis.edu'),
('96','10/05/2010','6381992','May, Dominic','1858','XXXXXXXXXXXXXXXX7227','hymenaeos@etcommodo.edu'),
('96','02/26/2010','8630748','Chandler, Olympia','1016','XXXXXXXXXXXXXXXX4001','sed.dui.Fusce@pellentesqueSed.com');
--insert a random fraud transaction to check against (based on customer and account_info only)
INSERT INTO [dbo].[test_userCheck] (confirmation_id, customer, total, account_info) VALUES
('2055015', 'Guerrero, Unity', '20.02', 'XXXXXXXXXXXXXXXX0821');
--get result, which is correct
SELECT a.confirmation_id, a.customer, a.total, a.account_info, a.email_address
FROM dbo.test_userData AS a
RIGHT OUTER JOIN dbo.test_userCheck AS b
    ON a.customer = b.customer OR a.account_info = b.account_info;
DROP TABLE [dbo].[test_userData];
DROP TABLE [dbo].[test_userCheck];
Answer 0 (score: 1)
Create appropriate indexes. Based on your question, I would suggest two indexes: one on test_userData.customer and another on test_userData.account_info.
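For illustration, a minimal sketch of what those two indexes might look like. The index names below are hypothetical, and the sketch assumes the customer and account_info columns are first narrowed from nvarchar(max), which cannot be an index key column in SQL Server, to a bounded length such as nvarchar(255):
--assumption: no existing values exceed 255 characters, so the columns can be narrowed to an indexable type
ALTER TABLE dbo.test_userData ALTER COLUMN customer nvarchar(255) NOT NULL;
ALTER TABLE dbo.test_userData ALTER COLUMN account_info nvarchar(255) NOT NULL;
--one nonclustered index per searched column (hypothetical index names)
CREATE NONCLUSTERED INDEX IX_test_userData_customer
    ON dbo.test_userData (customer);
CREATE NONCLUSTERED INDEX IX_test_userData_account_info
    ON dbo.test_userData (account_info);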
Answer 1 (score: 0)
Creating the indexes may help, but have you considered a different design that conforms to normal forms? It would be better if the data were accessed through indexes on integer columns rather than strings...
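As a rough illustration of that suggestion (the table and column names below are hypothetical, not part of the original schema), the customer and account details could be moved into their own lookup tables so that the large transaction table joins on indexed integer keys instead of strings:
--hypothetical lookup tables keyed by integers
CREATE TABLE dbo.customer (
    customer_id int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    customer_name nvarchar(255) NOT NULL
);
CREATE TABLE dbo.account (
    account_id int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    account_info nvarchar(255) NOT NULL
);
--the transaction table references the lookups by integer key,
--so the fraud check can join on int columns rather than nvarchar(max) strings
CREATE TABLE dbo.userData (
    id int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    merchant_id int NOT NULL,
    sales_date datetime NOT NULL,
    confirmation_id int NOT NULL,
    customer_id int NOT NULL REFERENCES dbo.customer (customer_id),
    account_id int NOT NULL REFERENCES dbo.account (account_id),
    total smallmoney NOT NULL
);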