MySQL join with MAX

Date: 2016-08-01 19:18:53

Tags: mysql

I am trying to join two MySQL tables, which simplify to:

+----------------------------+
| customers                  |
+-------------+-------+------+
| customer_id | first | last |
+-------------+-------+------+
| 0           | John  | Doe  |
+-------------+-------+------+
| 1           | Jane  | Doe  |
+-------------+-------+------+

+-------------------------------------------------------------------+
| contact_log                                                       |
+----------------+-------------+--------------+---------------------+
| contact_log_id | customer_id | contact_type | date_time           |
+----------------+-------------+--------------+---------------------+
| 0              | 0           | email        | 2016-05-17 03:21:45 |
+----------------+-------------+--------------+---------------------+
| 1              | 0           | phone        | 2016-05-17 16:11:35 |
+----------------+-------------+--------------+---------------------+
| ...            | ...         | ...          |                     |
+----------------+-------------+--------------+---------------------+

I need a query that selects customers along with the time and type of their most recent contact. I have tried this query:

SELECT
    `customers`.`customer_id`,
    `customers`.`first`,
    `customers`.`last`,
    `contact_log`.`contact_type`,
    MAX(`contact_log`.`date_time`)
FROM
    `customers`
JOIN
    `contact_log`
ON
    `customers`.`customer_id` = `contact_log`.`customer_id`

This generally orders date_time incorrectly. While researching the problem, I found that some MySQL versions have a bug where MAX/MIN do not work properly with DATETIME, so the workaround is

MAX(CAST(`contact_log`.`date_time` AS CHAR))

With that I do get the customer row with the latest date_time. However, the contact_type does not match that time. With the sample data, my result is:

+-------------+-------+------+--------------+---------------------+
| customer_id | first | last | contact_type | date_time           |
+-------------+-------+------+--------------+---------------------+
| 0           | John  | Doe  | email        | 2016-05-17 16:11:35 |
+-------------+-------+------+--------------+---------------------+

The contact_type does not match the date_time from the contact_log table. I suspect this has to do with the order in which the SELECT/JOIN happens and when the filtering takes place. I have to be careful with subqueries (avoiding n+1) because these are very large tables, and hundreds of rows may be selected from both.

What is the correct query so that contact_type matches date_time?

UPDATE: When I originally asked this question, I did not realize that you cannot have a subquery in a view. This needs to be saved as a view. To fully answer the question: how can this be broken into multiple views and combined into one?

3 answers:

Answer 0 (score: 2)

Without views

A simple solution is to use a subquery that fetches the contact log ordered by date, wrapped in an outer query that groups the rows by customer_id:

SELECT * FROM
(
    SELECT
        customers.customer_id,
        customers.first,
        customers.last,
        contact_log.contact_type,
        contact_log.date_time

        FROM customers
            INNER JOIN contact_log ON contact_log.customer_id = customers.customer_id -- or LEFT JOIN - see comment

        ORDER BY contact_log.date_time DESC
) logs GROUP BY logs.customer_id

If you have a huge database, you will have to check that the schema is properly indexed, that caching is enabled, and so on.

With views

The logic is the same. The subquery is replaced by a first view, which a second "global" view queries to group the results. Note that in the "logs" view I used GROUP BY instead of ORDER BY.

CREATE VIEW logs AS 
    SELECT
        customers.customer_id,
        customers.first,
        customers.last,
        contact_log.contact_type,
        contact_log.date_time

        FROM customers
            LEFT JOIN contact_log ON contact_log.customer_id = customers.customer_id

        GROUP BY
            customers.customer_id,
            contact_log.date_time DESC,
            contact_log.contact_type DESC;

CREATE VIEW testview AS SELECT * FROM logs GROUP BY logs.customer_id;

SELECT * FROM testview;
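
The update asks how to decompose this into views. As a runnable cross-check, here is a sketch of an alternative two-view split (SQLite standing in for MySQL via Python's sqlite3; the schema and Jane's contact row are assumptions invented to complete the sample data). Rather than relying on GROUP BY to keep the first row of an ordered set, which MySQL does not guarantee, this variant pins the latest row with MAX and joins back:

```python
import sqlite3

# Sketch only: SQLite stands in for MySQL, and the schema plus Jane's
# contact row are invented to complete the question's sample data.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (customer_id INTEGER, first TEXT, last TEXT);
    CREATE TABLE contact_log (contact_log_id INTEGER, customer_id INTEGER,
                              contact_type TEXT, date_time TEXT);
    INSERT INTO customers VALUES (0, 'John', 'Doe'), (1, 'Jane', 'Doe');
    INSERT INTO contact_log VALUES
        (0, 0, 'email', '2016-05-17 03:21:45'),
        (1, 0, 'phone', '2016-05-17 16:11:35'),
        (2, 1, 'email', '2016-05-18 09:00:00');

    -- View 1: the latest contact time per customer.
    CREATE VIEW last_contact AS
        SELECT customer_id, MAX(date_time) AS date_time
        FROM contact_log
        GROUP BY customer_id;

    -- View 2: join back to contact_log to pick up the matching contact_type.
    CREATE VIEW customer_last_contact AS
        SELECT c.customer_id, c.first, c.last, cl.contact_type, cl.date_time
        FROM customers c
        JOIN last_contact lc ON lc.customer_id = c.customer_id
        JOIN contact_log cl ON cl.customer_id = lc.customer_id
                           AND cl.date_time = lc.date_time;
""")
for row in con.execute("SELECT * FROM customer_last_contact ORDER BY customer_id"):
    print(row)
```

Each contact_type now comes from the same contact_log row as its date_time, and neither view contains a subquery.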

Answer 1 (score: 0)

Your problem is that you used MAX without a GROUP BY, so you get the maximum date over all records rather than per customer.
I use an inner query to get just the max date per customer, then join to it:

SELECT
    customers.customer_id,
    customers.first,
    customers.last,
    contact_log.contact_type,
    max_contact_log.date_time
FROM
    customers
JOIN
    (select customer_id, max(date_time) AS date_time
      FROM contact_log GROUP BY customer_id
    ) as max_contact_log
ON
    customers.customer_id = max_contact_log.customer_id
JOIN
    contact_log
ON
    contact_log.customer_id = max_contact_log.customer_id
    AND contact_log.date_time = max_contact_log.date_time;
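
A hedged check of this derived-table approach (again SQLite standing in for MySQL via Python's sqlite3; schema and Jane's contact row are invented fixture data). Note that the inner query must expose only customer_id and MAX(date_time): selecting contact_type as a bare column under GROUP BY is nondeterministic in MySQL, so the matching contact_type is recovered by joining contact_log back on both columns.

```python
import sqlite3

# Fixture is an assumption based on the question's simplified tables;
# Jane's contact row is invented.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (customer_id INTEGER, first TEXT, last TEXT);
    CREATE TABLE contact_log (contact_log_id INTEGER, customer_id INTEGER,
                              contact_type TEXT, date_time TEXT);
    INSERT INTO customers VALUES (0, 'John', 'Doe'), (1, 'Jane', 'Doe');
    INSERT INTO contact_log VALUES
        (0, 0, 'email', '2016-05-17 03:21:45'),
        (1, 0, 'phone', '2016-05-17 16:11:35'),
        (2, 1, 'email', '2016-05-18 09:00:00');
""")
rows = con.execute("""
    SELECT c.customer_id, c.first, c.last, cl.contact_type, cl.date_time
    FROM customers c
    -- Inner query: only the key and the aggregate, nothing nondeterministic.
    JOIN (SELECT customer_id, MAX(date_time) AS date_time
          FROM contact_log
          GROUP BY customer_id) AS m
         ON c.customer_id = m.customer_id
    -- Join back on both columns to fetch the matching contact_type.
    JOIN contact_log cl
         ON cl.customer_id = m.customer_id AND cl.date_time = m.date_time
    ORDER BY c.customer_id
""").fetchall()
print(rows)
```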

Answer 2 (score: 0)

Without subqueries, here is a solution using a HAVING clause:

select c.*, cl.contact_type, cl.date_time
from customers c
join contact_log cl
on c.customer_id = cl.customer_id
left join contact_log t
on cl.customer_id = t.customer_id
and cl.date_time <= t.date_time
group by c.customer_id, c.`first`, c.`last`, cl.contact_type, cl.date_time
having count(*) <= 1

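
This self-join keeps only the contact_log rows with no strictly newer sibling: for the latest row per customer, the only t-match is the row itself, so COUNT(*) is 1. A runnable check (SQLite standing in for MySQL via Python's sqlite3; schema and Jane's contact row are invented fixture data):

```python
import sqlite3

# Fixture is an assumption based on the question's simplified tables;
# Jane's contact row is invented.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (customer_id INTEGER, first TEXT, last TEXT);
    CREATE TABLE contact_log (contact_log_id INTEGER, customer_id INTEGER,
                              contact_type TEXT, date_time TEXT);
    INSERT INTO customers VALUES (0, 'John', 'Doe'), (1, 'Jane', 'Doe');
    INSERT INTO contact_log VALUES
        (0, 0, 'email', '2016-05-17 03:21:45'),
        (1, 0, 'phone', '2016-05-17 16:11:35'),
        (2, 1, 'email', '2016-05-18 09:00:00');
""")
rows = con.execute("""
    SELECT c.customer_id, c.first, c.last, cl.contact_type, cl.date_time
    FROM customers c
    JOIN contact_log cl ON c.customer_id = cl.customer_id
    -- Pair each log row with every row of the same customer that is
    -- at least as new (including itself).
    LEFT JOIN contact_log t
         ON cl.customer_id = t.customer_id AND cl.date_time <= t.date_time
    GROUP BY c.customer_id, c.first, c.last, cl.contact_type, cl.date_time
    -- Only the newest row per customer matches nothing newer than itself.
    HAVING COUNT(*) <= 1
    ORDER BY c.customer_id
""").fetchall()
print(rows)
```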