In my query:
SELECT
[Dt],
[ItemRelation],
[DocumentNum],
[DocumentDate],
[CalendarYear]
FROM
[Action].[dbo].[testtable]
the rows are grouped by these columns:
[ItemRelation]
[DocumentNum]
[CalendarYear]
and I need to delete every row that belongs to one of the groups listed below. Here are the group keys:
DECLARE @LIST_ABOVE TABLE (ItemRelation NVARCHAR(10),
DocumentNum NVARCHAR(10),
CalendarYear INT)
INSERT INTO @LIST_ABOVE (ItemRelation, DocumentNum, CalendarYear)
VALUES
(11511,5,2017),
(11628,2,2017),
(11661,163,2017),
(11692,82,2017),
(11709,143,2017),
(13189,33,2017),
(13284,2,2017),
(158009,12,2017),
(158121,63,2017),
(11514,60,2017),
(11628,3,2017),
(11671,13,2017),
(11706,8,2017),
(11741,163,2017),
(13191,7,2017),
(13284,3,2017),
(158010,12,2017),
(158122,41,2017),
(11592,33,2017),
(11628,140,2017),
(11683,70,2017),
(11706,50,2017),
(13163,70,2017),
(13191,33,2017),
(13322,4,2017),
(158010,89,2017),
(158122,62,2017),
(11594,9,2017),
(11633,75,2017),
(11683,140,2017),
(11706,51,2017),
(13163,75,2017),
(13250,83,2017),
(13322,36,2017),
(158010,95,2017),
(158122,63,2017),
(11623,71,2017),
(11634,154,2017),
(11683,154,2017),
(11706,58,2017),
(13163,131,2017),
(13269,50,2017),
(157186,57,2017),
(158121,41,2017),
(11626,29,2017),
(11661,143,2017),
(11683,163,2017),
(11709,81,2017),
(13189,13,2017),
(13269,66,2017),
(157192,56,2017),
(158121,62,2017)
When I run this script
DELETE T FROM [Action].[dbo].testtable T
WHERE EXISTS (SELECT 1
FROM @LIST_ABOVE
WHERE T.[ItemRelation] = [ItemRelation]
AND T.[DocumentNum] = [DocumentNum]
AND T.[CalendarYear] = [CalendarYear]);
the rows belonging to the groups above are not deleted from the table. Each group has 40 rows, so 40 * 52 = 2080 rows should be removed.
How can I delete the rows belonging to these groups? I just don't want to do it by hand, but I cannot get them deleted.
Answer 0 (score: 1)
First, you have declared ItemRelation and DocumentNum as NVARCHAR(10) but you insert integer values into them. That is not a problem in itself, but if the corresponding columns in testtable are numeric, a data type mismatch could explain why nothing matches. Next, try rewriting the delete as an INNER JOIN instead of EXISTS:
DELETE T
FROM [Action].[dbo].testtable AS T
INNER JOIN @LIST_ABOVE AS LA
ON LA.ItemRelation = T.ItemRelation
AND LA.DocumentNum = T.DocumentNum
AND LA.CalendarYear = T.CalendarYear
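Before running that delete, it can help to confirm that the join actually matches the expected 2080 rows. This is a small sketch reusing the same table and table variable names as above; it must run in the same batch where @LIST_ABOVE is declared:

-- Count the rows that the join would delete; 2080 is expected if every group matches
SELECT COUNT(*) AS MatchingRows
FROM [Action].[dbo].testtable AS T
INNER JOIN @LIST_ABOVE AS LA
    ON LA.ItemRelation = T.ItemRelation
    AND LA.DocumentNum = T.DocumentNum
    AND LA.CalendarYear = T.CalendarYear;

If this returns 0, the DELETE statement itself is not the problem; the rows simply are not matching, which points back at the data types or at the data itself.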
If the join delete still does not remove anything, then I suggest trying to delete just the first group from @LIST_ABOVE on its own:
-- the DELETE target cannot take an alias here, so the column names are left unqualified
DELETE FROM [Action].[dbo].testtable
WHERE ItemRelation = 11511
AND DocumentNum = 5
AND CalendarYear = 2017
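Before or after trying that, it is also worth checking whether that group exists in the target table at all. A quick sketch reusing the first group's values from @LIST_ABOVE:

-- Does the first group (11511, 5, 2017) exist in the target table?
SELECT COUNT(*) AS RowsForFirstGroup
FROM [Action].[dbo].testtable
WHERE ItemRelation = 11511
  AND DocumentNum = 5
  AND CalendarYear = 2017;

If this returns 0, those key values are simply not present in [Action].[dbo].testtable, at least not as exact matches.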
Finally, you need to work out why the delete is not happening. The most likely explanations are a data type mismatch, or that the data does not exist in [Action].[dbo].testtable at all.
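One way to rule out the data type question is to look at how the key columns are actually declared in the Action database. A minimal sketch, assuming the column names are exactly those used in the query above:

-- Inspect the declared types of the join columns in testtable
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM [Action].INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'testtable'
  AND COLUMN_NAME IN ('ItemRelation', 'DocumentNum', 'CalendarYear');

Comparing the output with how @LIST_ABOVE is declared (NVARCHAR(10), NVARCHAR(10), INT) shows whether the two sides of the join really use the same types.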