I have the following table:
CREATE TABLE IF NOT EXISTS `access_log` (
`id` INT(11) NOT NULL AUTO_INCREMENT,
`user_id` INT(11) NOT NULL DEFAULT 0,
`room_id` INT(11) NOT NULL DEFAULT 0,
`created` TIMESTAMP NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
Every time a user enters a room (`room_id`), a new record is added. I want to select, for each room, each user's first and last record.
Currently I have the following queries, which do not seem to return the correct records.
First record per user per room:
SELECT al.* FROM `access_log` AS `al`
LEFT JOIN `rooms` AS `r` ON al.room_id = r.id
INNER JOIN (
SELECT user_id, room_id, min(created) AS min_date
FROM `access_log`
WHERE `user_id` != 0
GROUP BY user_id, room_id) AS al2
ON al.user_id = al2.user_id AND al.room_id = al2.room_id AND al.created = al2.min_date
WHERE `al`.`created` >= '2019-06-09 00:00:00' AND `al`.`created` <= '2019-06-12 23:59:59'
Last record per user per room:
SELECT al.* FROM `access_log` AS `al`
LEFT JOIN `rooms` AS `r` ON al.room_id = r.id
INNER JOIN (
SELECT user_id, room_id, max(created) AS max_date
FROM `access_log`
WHERE `user_id` != 0
GROUP BY user_id, room_id) AS al2
ON al.user_id = al2.user_id AND al.room_id = al2.room_id AND al.created = al2.max_date
WHERE `al`.`created` >= '2019-06-09 00:00:00' AND `al`.`created` <= '2019-06-12 23:59:59'
Here is an SQLFiddle demo that includes sample data: http://www.sqlfiddle.com/#!9/fc5f8b/2. You can see that the queries return unexpected results: they do not list distinct dates, even though they do list distinct rooms. Also, the first and last queries return different numbers of rows.
The same DDL, with sample data:
CREATE TABLE IF NOT EXISTS `access_log` (
  `id` INT(11) NOT NULL AUTO_INCREMENT,
  `user_id` INT(11) NOT NULL DEFAULT 0,
  `room_id` INT(11) NOT NULL DEFAULT 0,
  `created` TIMESTAMP NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

INSERT INTO `access_log` (`id`, `user_id`, `room_id`, `created`) VALUES
(1, 90000017, 6, '2019-06-10 01:15:00'),
(2, 90000017, 6, '2019-06-10 01:25:00'),
(3, 90000018, 6, '2019-06-10 02:15:00'),
(4, 90000018, 6, '2019-06-10 02:25:00'),
(5, 90000019, 6, '2019-06-10 03:15:00'),
(6, 90000019, 6, '2019-06-10 03:25:00'),
(7, 90000017, 5, '2019-06-10 11:15:00'),
(8, 90000017, 5, '2019-06-10 11:25:00'),
(9, 90000018, 5, '2019-06-10 12:15:00'),
(10, 90000018, 5, '2019-06-10 12:25:00'),
(11, 90000019, 5, '2019-06-10 13:15:00'),
(12, 90000019, 5, '2019-06-10 13:25:00'),
(13, 90000017, 6, '2019-06-11 04:10:00'),
(14, 90000017, 6, '2019-06-11 04:20:00'),
(15, 90000018, 6, '2019-06-11 05:10:00'),
(16, 90000018, 6, '2019-06-11 05:20:00'),
(17, 90000019, 6, '2019-06-11 06:10:00'),
(18, 90000019, 6, '2019-06-11 06:20:00'),
(19, 90000017, 5, '2019-06-11 14:10:00'),
(20, 90000017, 5, '2019-06-11 14:20:00'),
(21, 90000018, 5, '2019-06-11 15:10:00'),
(22, 90000018, 5, '2019-06-11 15:20:00'),
(23, 90000019, 5, '2019-06-11 16:20:00'),
(24, 90000019, 5, '2019-06-11 16:20:00');
The expected results should look like this:
First per user per room per day
+------+-----------+---------+---------------------+
| id   | user_id   | room_id | created             |
+------+-----------+---------+---------------------+
| 1    | 90000017  | 6       | 2019-06-10 01:15:00 |
| 3    | 90000018  | 6       | 2019-06-10 02:15:00 |
| 5    | 90000019  | 6       | 2019-06-10 03:15:00 |
| 7    | 90000017  | 5       | 2019-06-10 11:15:00 |
| 9    | 90000018  | 5       | 2019-06-10 12:15:00 |
| 11   | 90000019  | 5       | 2019-06-10 13:15:00 |
| 13   | 90000017  | 6       | 2019-06-11 04:10:00 |
| 15   | 90000018  | 6       | 2019-06-11 05:10:00 |
| 17   | 90000019  | 6       | 2019-06-11 06:10:00 |
| 19   | 90000017  | 5       | 2019-06-11 14:10:00 |
| 21   | 90000018  | 5       | 2019-06-11 15:10:00 |
| 23   | 90000019  | 5       | 2019-06-11 16:20:00 |
+------+-----------+---------+---------------------+

Last per user per room per day
+------+-----------+---------+---------------------+
| id   | user_id   | room_id | created             |
+------+-----------+---------+---------------------+
| 2    | 90000017  | 6       | 2019-06-10 01:25:00 |
| 4    | 90000018  | 6       | 2019-06-10 02:25:00 |
| 6    | 90000019  | 6       | 2019-06-10 03:25:00 |
| 8    | 90000017  | 5       | 2019-06-10 11:25:00 |
| 10   | 90000018  | 5       | 2019-06-10 12:25:00 |
| 12   | 90000019  | 5       | 2019-06-10 13:25:00 |
| 14   | 90000017  | 6       | 2019-06-11 04:20:00 |
| 16   | 90000018  | 6       | 2019-06-11 05:20:00 |
| 18   | 90000019  | 6       | 2019-06-11 06:20:00 |
| 20   | 90000017  | 5       | 2019-06-11 14:20:00 |
| 22   | 90000018  | 5       | 2019-06-11 15:20:00 |
| 24   | 90000019  | 5       | 2019-06-11 16:20:00 |
+------+-----------+---------+---------------------+
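The row-count mismatch described above can be reproduced without a MySQL server. The sketch below uses Python's sqlite3 as a stand-in (the join and grouping semantics are the same for this data); the table and column names come from the question:

```python
import sqlite3

# Sample data from the question: (id, user_id, room_id, created).
ROWS = [
    (1, 90000017, 6, '2019-06-10 01:15:00'), (2, 90000017, 6, '2019-06-10 01:25:00'),
    (3, 90000018, 6, '2019-06-10 02:15:00'), (4, 90000018, 6, '2019-06-10 02:25:00'),
    (5, 90000019, 6, '2019-06-10 03:15:00'), (6, 90000019, 6, '2019-06-10 03:25:00'),
    (7, 90000017, 5, '2019-06-10 11:15:00'), (8, 90000017, 5, '2019-06-10 11:25:00'),
    (9, 90000018, 5, '2019-06-10 12:15:00'), (10, 90000018, 5, '2019-06-10 12:25:00'),
    (11, 90000019, 5, '2019-06-10 13:15:00'), (12, 90000019, 5, '2019-06-10 13:25:00'),
    (13, 90000017, 6, '2019-06-11 04:10:00'), (14, 90000017, 6, '2019-06-11 04:20:00'),
    (15, 90000018, 6, '2019-06-11 05:10:00'), (16, 90000018, 6, '2019-06-11 05:20:00'),
    (17, 90000019, 6, '2019-06-11 06:10:00'), (18, 90000019, 6, '2019-06-11 06:20:00'),
    (19, 90000017, 5, '2019-06-11 14:10:00'), (20, 90000017, 5, '2019-06-11 14:20:00'),
    (21, 90000018, 5, '2019-06-11 15:10:00'), (22, 90000018, 5, '2019-06-11 15:20:00'),
    (23, 90000019, 5, '2019-06-11 16:20:00'), (24, 90000019, 5, '2019-06-11 16:20:00'),
]

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE access_log (id INT, user_id INT, room_id INT, created TEXT)')
conn.executemany('INSERT INTO access_log VALUES (?, ?, ?, ?)', ROWS)

def per_group(agg):
    # The question's subquery: it groups by (user_id, room_id) only,
    # so both calendar days collapse into a single group.
    return [r[0] for r in conn.execute(f"""
        SELECT al.id FROM access_log al
        JOIN (SELECT user_id, room_id, {agg}(created) AS d
              FROM access_log GROUP BY user_id, room_id) g
          ON al.user_id = g.user_id AND al.room_id = g.room_id
         AND al.created = g.d
        ORDER BY al.id""")]

first_ids = per_group('MIN')
last_ids = per_group('MAX')
print(len(first_ids), len(last_ids))  # 6 7
```

The MIN query returns 6 rows (one per user/room pair, all from June 10), and the MAX query returns 7, because rows 23 and 24 share an identical timestamp and the equality join matches both: exactly the symptoms reported in the question.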
Answer 0 (score: 1)
I suggest cross-checking the expected results with the following single query:
SELECT
  GROUP_CONCAT(id ORDER BY created, id SEPARATOR ' ') all_id,
  -- returns all ids present in the group
  SUBSTRING_INDEX(GROUP_CONCAT(id ORDER BY created, id SEPARATOR ' '), ' ', 1) min_id_in,
  -- takes the first value from the GROUP_CONCAT list above
  SUBSTRING_INDEX(GROUP_CONCAT(id ORDER BY created, id SEPARATOR ' '), ' ', -1) max_id_in,
  -- takes the last value from the same GROUP_CONCAT list
  user_id, room_id,
  MIN(created),
  MAX(created) -- min and max values are both shown in the same query
FROM access_log
GROUP BY user_id, room_id,
  date(created); -- the missing grouping: OP wants results per date
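`GROUP_CONCAT` with `ORDER BY` and `SUBSTRING_INDEX` are MySQL-specific, so for a quick check outside MySQL the same cross-check can be sketched in plain Python; the grouping by (user_id, room_id, calendar date) is the point, and the data is copied from the question:

```python
from itertools import groupby

# Sample data from the question: (id, user_id, room_id, created).
ROWS = [
    (1, 90000017, 6, '2019-06-10 01:15:00'), (2, 90000017, 6, '2019-06-10 01:25:00'),
    (3, 90000018, 6, '2019-06-10 02:15:00'), (4, 90000018, 6, '2019-06-10 02:25:00'),
    (5, 90000019, 6, '2019-06-10 03:15:00'), (6, 90000019, 6, '2019-06-10 03:25:00'),
    (7, 90000017, 5, '2019-06-10 11:15:00'), (8, 90000017, 5, '2019-06-10 11:25:00'),
    (9, 90000018, 5, '2019-06-10 12:15:00'), (10, 90000018, 5, '2019-06-10 12:25:00'),
    (11, 90000019, 5, '2019-06-10 13:15:00'), (12, 90000019, 5, '2019-06-10 13:25:00'),
    (13, 90000017, 6, '2019-06-11 04:10:00'), (14, 90000017, 6, '2019-06-11 04:20:00'),
    (15, 90000018, 6, '2019-06-11 05:10:00'), (16, 90000018, 6, '2019-06-11 05:20:00'),
    (17, 90000019, 6, '2019-06-11 06:10:00'), (18, 90000019, 6, '2019-06-11 06:20:00'),
    (19, 90000017, 5, '2019-06-11 14:10:00'), (20, 90000017, 5, '2019-06-11 14:20:00'),
    (21, 90000018, 5, '2019-06-11 15:10:00'), (22, 90000018, 5, '2019-06-11 15:20:00'),
    (23, 90000019, 5, '2019-06-11 16:20:00'), (24, 90000019, 5, '2019-06-11 16:20:00'),
]

# Group key: (user_id, room_id, calendar date); the first 10 chars
# of `created` are the YYYY-MM-DD date.
key = lambda r: (r[1], r[2], r[3][:10])

# Sort by group key, then by (created, id) -- mirroring
# GROUP_CONCAT(id ORDER BY created, id).
ordered = sorted(ROWS, key=lambda r: (key(r), r[3], r[0]))

first_last = {}
for k, grp in groupby(ordered, key=key):
    grp = list(grp)
    first_last[k] = (grp[0][0], grp[-1][0])  # (min_id_in, max_id_in)

print(len(first_last))  # 12 groups: 3 users x 2 rooms x 2 days
```

Each group's pair matches one row of the "first" table and one row of the "last" table in the expected output.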
I added `date(created)` to the `GROUP BY` clause.
Here are the original queries from the fiddle with that fix applied:
SELECT al.* FROM `access_log` AS `al`
INNER JOIN (
SELECT user_id, room_id, min(created) AS min_date
FROM `access_log`
WHERE `user_id` != 0
GROUP BY user_id, room_id,
date(created) -- I've added the condition here
) AS al2
ON al.user_id = al2.user_id AND al.room_id = al2.room_id AND al.created = al2.min_date
WHERE `al`.`created` >= '2019-06-09 00:00:00' AND `al`.`created` <= '2019-06-12 23:59:59'
ORDER BY al.user_id ASC;
SELECT al.* FROM `access_log` AS `al`
INNER JOIN (
SELECT user_id, room_id, max(created) AS max_date
FROM `access_log`
WHERE `user_id` != 0
GROUP BY user_id, room_id,
date(created) -- and here
) AS al2
ON al.user_id = al2.user_id AND al.room_id = al2.room_id AND al.created = al2.max_date
WHERE `al`.`created` >= '2019-06-09 00:00:00' AND `al`.`created` <= '2019-06-12 23:59:59'
ORDER BY al.user_id ASC;
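The corrected queries can be sanity-checked without a MySQL server using sqlite3 from Python (`date()` extracts the calendar date from these timestamp strings in both engines). One subtlety in the sample data: rows 23 and 24 carry identical timestamps, so the equality join returns both of them for that group:

```python
import sqlite3

# Sample data from the question: (id, user_id, room_id, created).
ROWS = [
    (1, 90000017, 6, '2019-06-10 01:15:00'), (2, 90000017, 6, '2019-06-10 01:25:00'),
    (3, 90000018, 6, '2019-06-10 02:15:00'), (4, 90000018, 6, '2019-06-10 02:25:00'),
    (5, 90000019, 6, '2019-06-10 03:15:00'), (6, 90000019, 6, '2019-06-10 03:25:00'),
    (7, 90000017, 5, '2019-06-10 11:15:00'), (8, 90000017, 5, '2019-06-10 11:25:00'),
    (9, 90000018, 5, '2019-06-10 12:15:00'), (10, 90000018, 5, '2019-06-10 12:25:00'),
    (11, 90000019, 5, '2019-06-10 13:15:00'), (12, 90000019, 5, '2019-06-10 13:25:00'),
    (13, 90000017, 6, '2019-06-11 04:10:00'), (14, 90000017, 6, '2019-06-11 04:20:00'),
    (15, 90000018, 6, '2019-06-11 05:10:00'), (16, 90000018, 6, '2019-06-11 05:20:00'),
    (17, 90000019, 6, '2019-06-11 06:10:00'), (18, 90000019, 6, '2019-06-11 06:20:00'),
    (19, 90000017, 5, '2019-06-11 14:10:00'), (20, 90000017, 5, '2019-06-11 14:20:00'),
    (21, 90000018, 5, '2019-06-11 15:10:00'), (22, 90000018, 5, '2019-06-11 15:20:00'),
    (23, 90000019, 5, '2019-06-11 16:20:00'), (24, 90000019, 5, '2019-06-11 16:20:00'),
]

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE access_log (id INT, user_id INT, room_id INT, created TEXT)')
conn.executemany('INSERT INTO access_log VALUES (?, ?, ?, ?)', ROWS)

def per_day(agg):
    # The corrected subquery: date(created) added to the GROUP BY,
    # giving one group per user, room, and calendar day.
    return [r[0] for r in conn.execute(f"""
        SELECT al.id FROM access_log al
        JOIN (SELECT user_id, room_id, {agg}(created) AS d
              FROM access_log GROUP BY user_id, room_id, date(created)) g
          ON al.user_id = g.user_id AND al.room_id = g.room_id
         AND al.created = g.d
        ORDER BY al.id""")]

print(per_day('MIN'))  # [1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 24]
print(per_day('MAX'))  # [2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 23, 24]
# ids 23 and 24 appear in both lists because their timestamps tie;
# everything else matches the expected tables exactly.
```

If at most one row per group is wanted even on ties, joining on `MIN(id)`/`MAX(id)` (or adding them as a tiebreaker) would resolve the duplicates.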
Answer 1 (score: 0)
The subquery should select from the same table as the main query, i.e. it should select FROM `access_log`.