Bug #92483 SELECT MAX GROUP BY failing with WHERE clause
Submitted: 18 Sep 2018 14:06  Modified: 30 Oct 2018 14:33
Reporter: Leif Mortenson
Status: No Feedback
Category: MySQL Server: DML  Severity: S2 (Serious)
Version: 8.0.12  OS: Windows (10)
Assigned to:  CPU Architecture: x86

[18 Sep 2018 14:06] Leif Mortenson
Description:
The following simple query fails to return any records on version 8.0.12. It works correctly on 5.1.14.

---
SELECT tag_id, MAX(datetime) AS maxdt
  FROM online_samples
 WHERE datetime <= '2018-09-18 19:06:05'
 GROUP BY tag_id;
---
 

How to repeat:
Version information:
+-------------------------+------------------------------+
| Variable_name           | Value                        |
+-------------------------+------------------------------+
| innodb_version          | 8.0.12                       |
| protocol_version        | 10                           |
| slave_type_conversions  |                              |
| tls_version             | TLSv1,TLSv1.1,TLSv1.2        |
| version                 | 8.0.12                       |
| version_comment         | MySQL Community Server - GPL |
| version_compile_machine | x86_64                       |
| version_compile_os      | Win64                        |
| version_compile_zlib    | 1.2.11                       |
+-------------------------+------------------------------+

I have the following table:
---
CREATE TABLE `online_samples` (
  `datetime` datetime NOT NULL,
  `tag_id` varchar(40) CHARACTER SET utf8 COLLATE utf8_unicode_ci NOT NULL,
  `value` float DEFAULT NULL,
  `wrtime` datetime NOT NULL,
  PRIMARY KEY (`datetime`,`tag_id`),
  UNIQUE KEY `online_samples_index_tag_id_datetime` (`tag_id`,`datetime` DESC),
  KEY `online_samples_index_wrtime` (`wrtime`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
---
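
One possibly relevant detail about this schema: MySQL 5.x parses but ignores `DESC` in index definitions, while 8.0 actually builds descending indexes, so the unique key above is physically different on the two versions. I do not have the plan from the affected machine, but an EXPLAIN of the failing query would show whether the optimizer picks that index for the grouping:
---
EXPLAIN SELECT tag_id, MAX(datetime) AS maxdt
  FROM online_samples
 WHERE datetime <= '2018-09-18 19:06:05'
 GROUP BY tag_id;
---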

If I run the following query I get results as follows:
---
SELECT tag_id, datetime
  FROM online_samples
 WHERE datetime <= '2018-09-18 19:06:05'
 LIMIT 10;
---
Results:
+------------+---------------------+
| tag_id     | datetime            |
+------------+---------------------+
| test1_ABCD | 2018-08-23 13:37:25 |
| test1_E    | 2018-08-23 13:37:25 |
| test1_FGH  | 2018-08-23 13:37:25 |
| test1_ABCD | 2018-08-23 13:37:30 |
| test1_E    | 2018-08-23 13:37:30 |
| test1_FGH  | 2018-08-23 13:37:30 |
| test1_ABCD | 2018-08-23 13:37:35 |
| test1_E    | 2018-08-23 13:37:35 |
| test1_FGH  | 2018-08-23 13:37:35 |
| test1_ABCD | 2018-08-23 13:37:40 |
+------------+---------------------+
10 rows in set (0.00 sec)

If I do a simple group by without restricting by datetime it works:
---
SELECT tag_id, MAX(datetime)
  FROM online_samples
 GROUP BY tag_id;
---
Results:
+--------------+---------------------+
| tag_id       | MAX(datetime)       |
+--------------+---------------------+
| test3t1_IJ   | 2018-09-18 19:00:56 |
| test3t1_ABCD | 2018-09-18 18:42:05 |
| test3t1_KL   | 2018-09-18 18:56:43 |
| test4t2_IJ   | 2018-09-18 19:05:25 |
| test4t2_ABCD | 2018-09-18 19:05:25 |
| test4t2_KL   | 2018-09-18 19:05:25 |
| test4t2_O    | 2018-09-18 19:05:25 |
| test4t2_MN   | 2018-09-18 19:05:25 |
| test4t3_IJ   | 2018-08-29 11:52:55 |
| test4t3_ABCD | 2018-08-29 11:52:55 |
| test4t3_KL   | 2018-08-29 11:52:55 |
| test4t3_O    | 2018-08-29 11:52:55 |
| test4t3_MN   | 2018-08-29 11:52:55 |
| test1_ABCD   | 2018-09-18 19:05:25 |
| test1_E      | 2018-09-18 19:05:25 |
| test1_FGH    | 2018-09-18 19:05:25 |
| test2_E      | 2018-08-29 16:41:00 |
| test2_FGH    | 2018-08-29 16:40:00 |
+--------------+---------------------+

But if I add the two together as follows I get no results:
---
SELECT tag_id, MAX(datetime) AS maxdt
  FROM online_samples
 WHERE datetime <= '2018-09-18 19:06:05'
 GROUP BY tag_id;
---
Results:
Empty set (0.00 sec)
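
If the descending unique index is involved, ignoring it for the GROUP BY should change the behavior. This is only a suggested diagnostic; I have not been able to verify that it changes the result:
---
SELECT tag_id, MAX(datetime) AS maxdt
  FROM online_samples IGNORE INDEX FOR GROUP BY (online_samples_index_tag_id_datetime)
 WHERE datetime <= '2018-09-18 19:06:05'
 GROUP BY tag_id;
---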

This is an incredibly simple query, so it has been driving me crazy trying to work out what the problem could be.

If I do exactly the same thing with version 5.1.14, on Linux or on Windows, it works perfectly:
+--------------+---------------------+
| tag_id       | maxdt               |
+--------------+---------------------+
| test3t1_IJ   | 2018-09-18 19:00:56 |
| test3t1_ABCD | 2018-09-18 18:42:05 |
| test3t1_KL   | 2018-09-18 18:56:43 |
| test4t2_IJ   | 2018-09-18 19:05:25 |
| test4t2_ABCD | 2018-09-18 19:05:25 |
| test4t2_KL   | 2018-09-18 19:05:25 |
| test4t2_O    | 2018-09-18 19:05:25 |
| test4t2_MN   | 2018-09-18 19:05:25 |
| test4t3_IJ   | 2018-08-29 11:52:55 |
| test4t3_ABCD | 2018-08-29 11:52:55 |
| test4t3_KL   | 2018-08-29 11:52:55 |
| test4t3_O    | 2018-08-29 11:52:55 |
| test4t3_MN   | 2018-08-29 11:52:55 |
| test1_ABCD   | 2018-09-18 19:05:25 |
| test1_E      | 2018-09-18 19:05:25 |
| test1_FGH    | 2018-09-18 19:05:25 |
| test2_E      | 2018-08-29 16:41:00 |
| test2_FGH    | 2018-08-29 16:40:00 |
+--------------+---------------------+
18 rows in set (0.00 sec)

Any idea what the problem could be? I had to revert to the old DB version again to get things working.
[18 Sep 2018 14:12] Leif Mortenson
Corrected the synopsis.
[18 Sep 2018 16:28] MySQL Verification Team
Thank you for the bug report. Please provide a dump file with data so that we can run the query. (Use the Files tab to attach it.) Thanks.
[20 Sep 2018 4:28] Leif Mortenson
This was a customer machine with proprietary information in the DB.
I had exported the data and reimported it back into the old DB.
It was possible to reproduce this with several queries, though.
I would need to set up a new machine again to try this out.  Let me know if you need me to do this.

The table schema is in the report.  There were several thousand rows matching the pattern of data shown.  No external joins or anything were required to reproduce.
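
For reference, a self-contained script along these lines exercises the same pattern. The values are synthetic placeholders, not the customer data, and a handful of rows may or may not be enough to trigger the problem:
---
CREATE TABLE `online_samples` (
  `datetime` datetime NOT NULL,
  `tag_id` varchar(40) CHARACTER SET utf8 COLLATE utf8_unicode_ci NOT NULL,
  `value` float DEFAULT NULL,
  `wrtime` datetime NOT NULL,
  PRIMARY KEY (`datetime`,`tag_id`),
  UNIQUE KEY `online_samples_index_tag_id_datetime` (`tag_id`,`datetime` DESC),
  KEY `online_samples_index_wrtime` (`wrtime`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;

-- Synthetic sample rows following the pattern shown in the report.
INSERT INTO online_samples (datetime, tag_id, value, wrtime) VALUES
  ('2018-08-23 13:37:25', 'test1_ABCD', 1.0, '2018-08-23 13:37:25'),
  ('2018-08-23 13:37:25', 'test1_E',    2.0, '2018-08-23 13:37:25'),
  ('2018-08-23 13:37:30', 'test1_ABCD', 3.0, '2018-08-23 13:37:30'),
  ('2018-08-23 13:37:30', 'test1_E',    4.0, '2018-08-23 13:37:30');

-- Expected: one row per tag_id; on the affected 8.0.12 server this returned an empty set.
SELECT tag_id, MAX(datetime) AS maxdt
  FROM online_samples
 WHERE datetime <= '2018-09-18 19:06:05'
 GROUP BY tag_id;
---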

Cheers
[30 Sep 2018 14:33] MySQL Verification Team
Thank you for the feedback. I tried with my own data and could not repeat the issue, so I need you to provide a dump file with insert data that makes the issue repeatable on your side. Thanks.
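For example, a table-only dump along these lines would be enough (here `mydb` is a placeholder for your schema name; adjust credentials and the --where filter as needed):
---
mysqldump --where="datetime <= '2018-09-18 19:06:05'" mydb online_samples > online_samples_dump.sql
---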
[1 Nov 2018 1:00] Bugs System
No feedback was provided for this bug for over a month, so it is
being suspended automatically. If you are able to provide the
information that was originally requested, please do so and change
the status of the bug back to "Open".