I've searched the forums, but I either can't phrase the question properly or don't understand the answers, so I need someone to walk me through it step by step.
The problem: I have a table of users in my database. Comparing by email, there are some duplicates. Based on registration date, some of them take priority (we discard the ones with an earlier registration date), but some of the lower-priority records have more fields filled in (e.g. gender, address, phone, and so on).
The flow I'd like is: find duplicates by email -> pick the row with the most recent registration date as the winner -> if a cell in that row is empty, fill it with data from a lower-priority row.
P.S. The catch is that there can be up to three duplicate accounts with the same email.
I can't wrap my head around this. Here is what I have and what I want:
CREATE TABLE [dbo].[Person](
    [userID] [int] PRIMARY KEY,
    [email] [nvarchar](50),
    [priority] [nvarchar](2),
    [FirstName] [nvarchar](50),
    [LastName] [nvarchar](50)
)
GO
INSERT INTO Person VALUES (1,'a@a.com','1',NULL,NULL);
INSERT INTO Person VALUES (2,'a@a.com','2','Dennis','Li');
INSERT INTO Person VALUES (3,'b@b.com','1','Brent','Li');
INSERT INTO Person VALUES (4,'c@c.com','1',NULL,NULL);
INSERT INTO Person VALUES (5,'c@c.com','2',NULL,'Raji');
INSERT INTO Person VALUES (6,'c@c.com','3','Ben','Raji');
GO
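For reference, the flow described in the question can also be sketched with window functions, as an alternative to the join- and subquery-based answers below. This is only an illustrative sketch, assuming SQL Server 2012+ for `FIRST_VALUE`, that a lower `priority` number means a newer registration (as in the sample data), and that empty cells are stored as NULL:

```sql
-- Sketch: for each email, take each column from the highest-priority
-- (lowest priority number) row where that column is not NULL.
SELECT DISTINCT
    email,
    FIRST_VALUE(FirstName) OVER (
        PARTITION BY email
        ORDER BY CASE WHEN FirstName IS NULL THEN 1 ELSE 0 END, priority
        ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING) AS FirstName,
    FIRST_VALUE(LastName) OVER (
        PARTITION BY email
        ORDER BY CASE WHEN LastName IS NULL THEN 1 ELSE 0 END, priority
        ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING) AS LastName
FROM Person;
```

The `CASE` expression pushes NULLs to the end of each window, so `FIRST_VALUE` picks the best non-NULL value per email regardless of how many duplicates exist.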
Answer 0 (score: 0)
With the script below, we keep the record with the most recent regDate and fill its NULL values from the previous record for the same email.
However, if three or more users share the same email, the older rows are ignored; we only merge the newest row with the second-newest one:
INSERT INTO Users ([email],[firstName],[lastName],[street],[city],[code],[country],[phone],[regDate])
VALUES ('a@a.com', 'Andrew', null, null, null, null, null, null, '2018-03-09 00:00:00');
INSERT INTO Users ([email],[firstName],[lastName],[street],[city],[code],[country],[phone],[regDate])
VALUES ('a@a.com', 'ANDREEW', 'Lopez', null, 'Santos', null, null, null, '2018-03-08 00:00:00');
INSERT INTO Users ([email],[firstName],[lastName],[street],[city],[code],[country],[phone],[regDate])
VALUES ('b@b.com', 'Bob', 'Wilk', null, null, null, null, null, '2018-03-10 00:00:00');
INSERT INTO Users ([email],[firstName],[lastName],[street],[city],[code],[country],[phone],[regDate])
VALUES ('b@b.com', 'Robert', null, 'Sandiego Street', 'Santos', null, null, '456 123 789', '2018-03-05 00:00:00');
SELECT * FROM Users;
INSERT INTO Users ([email],[firstName],[lastName],[street],[city],[code],[country],[phone],[regDate])
SELECT
    u.[email],
    ISNULL(u.firstName, old.firstName),
    ISNULL(u.lastName, old.lastName),
    ISNULL(u.street, old.street),
    ISNULL(u.city, old.city),
    ISNULL(u.code, old.code),
    ISNULL(u.country, old.country),
    ISNULL(u.phone, old.phone),
    u.regDate
FROM Users u
INNER JOIN Users old
    ON old.Id = (SELECT TOP 1 Id FROM Users oldMax
                 WHERE oldMax.email = u.email AND oldMax.Id <> u.Id
                 ORDER BY oldMax.regDate DESC)
WHERE u.Id = (SELECT TOP 1 new.Id FROM Users new
              WHERE new.email = u.email
              ORDER BY new.regDate DESC);
DELETE FROM Users WHERE Id NOT IN (SELECT MAX(Id) FROM Users GROUP BY email);
SELECT * FROM Users;
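Tracing the script by hand on the sample rows, the final SELECT should return something like the following (a hand-traced sketch, assuming `Id` is an IDENTITY column so the merged rows receive the highest Ids and survive the DELETE):

```sql
-- Hand-traced final contents of Users (assumption: Id is IDENTITY):
-- email   | firstName | lastName | street          | city   | phone       | regDate
-- a@a.com | Andrew    | Lopez    | NULL            | Santos | NULL        | 2018-03-09
-- b@b.com | Bob       | Wilk     | Sandiego Street | Santos | 456 123 789 | 2018-03-10
```

Each kept row has the newest regDate, with its NULL cells (lastName for a@a.com; street, city, and phone for b@b.com) filled from the older duplicate.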
Here is a working fiddle.
Answer 1 (score: 0)
This should do it:
declare @T TABLE (
[userID] int PRIMARY KEY,
[email] [nvarchar] (50),
[priority] tinyint,
[FirstName] [nvarchar](50),
[LastName] [nvarchar](50)
);
INSERT INTO @T VALUES
(1,'a@a.com', 1, null, null)
, (2,'a@a.com', 2, 'Dennis','Li')
, (3,'b@b.com', 1, 'Brent','Li')
, (4,'c@c.com', 1, null,null)
, (5,'c@c.com', 2, null,'Raji')
, (6,'c@c.com', 3, 'Ben','Raji');
select t1.email
, (select top 1 tt.FirstName from @T tt where tt.FirstName is not null and tt.email = t1.email order by tt.priority asc) as FN
, (select top 1 tt.LastName from @T tt where tt.LastName is not null and tt.email = t1.email order by tt.priority asc) as LN
from @T t1
group by t1.email
order by t1.email;
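Run against the sample data, this query should produce one row per email, with each name taken from the lowest-numbered priority row where it is not NULL (a hand-traced sketch, assuming empty cells are stored as NULL):

```sql
-- Hand-traced result:
-- email   | FN     | LN
-- a@a.com | Dennis | Li
-- b@b.com | Brent  | Li
-- c@c.com | Ben    | Raji
```

Note that for c@c.com the first name comes from the priority-3 row and the last name from the priority-2 row, so this approach handles three (or more) duplicates per email without extra joins.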
Answer 2 (score: 0)
The first CTE below restricts the output to duplicated emails only. If you need a query that works for both duplicated and non-duplicated emails, just remove that first CTE and you're done!
;WITH DuplicatedEmails AS
(
SELECT
P.Email
FROM
Person AS p
GROUP BY
P.Email
HAVING
COUNT(1) > 1
),
DuplicatedEmailUserData AS
(
SELECT
P.*,
EmailRanking = ROW_NUMBER() OVER (PARTITION BY Email ORDER BY Priority DESC) -- Assuming a higher priority comes first
FROM
Person AS P
INNER JOIN DuplicatedEmails AS E ON P.Email = E.Email
)
SELECT
D1.UserID,
D1.Email,
D1.Priority,
FirstName = COALESCE(D1.FirstName, D2.FirstName, D3.Firstname), -- Use COALESCE for the columns that might be NULL on 1st record
LastName = COALESCE(D1.LastName, D2.LastName, D3.Lastname)
FROM
DuplicatedEmailUserData AS D1
LEFT JOIN DuplicatedEmailUserData AS D2 ON
D1.Email = D2.Email AND
D1.EmailRanking + 1 = D2.EmailRanking
LEFT JOIN DuplicatedEmailUserData AS D3 ON
D1.Email = D3.Email AND
D2.EmailRanking + 1 = D3.EmailRanking
WHERE
D1.EmailRanking = 1
With this approach, you need as many LEFT JOINs as there can be duplicates of the same email.