Bug #103145 MySQL 8.0.22 decimal divide decimal has unexpected behavior
Submitted: 30 Mar 5:35    Modified: 30 Mar 20:10
Reporter: yu lei
Status: Not a Bug    Impact on me: None
Category: MySQL Server: Optimizer    Severity: S3 (Non-critical)
Version: 8.0.22, 8.0.23    OS: Linux
Assigned to:    CPU Architecture: Any

[30 Mar 5:35] yu lei
Description:
As the documentation at https://dev.mysql.com/doc/refman/8.0/en/arithmetic-functions.html says, the result of a division has a scale equal to the scale of the first operand plus div_precision_increment (4 by default).
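For instance, assuming the default div_precision_increment of 4, the result scale should grow by four decimal places:

SELECT 1 / 3;    -- 0.3333 (scale 0 + 4)
SELECT 1.5 / 3;  -- 0.50000 (scale 1 + 4)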

Here's an example:

CREATE TABLE t (a DECIMAL(10, 0));
INSERT INTO t VALUES(1);
SELECT * FROM t WHERE a / 10000;

If the documentation is right, this will produce a result set with 1 row.

However, in MySQL 5.7.33 it produces an empty result set regardless of whether I change the predicate a / 10000 to a / 1000 or a / 100000.

In MySQL 8.0.22, on the other hand, it produces a result set with 1 row even if I change the predicate a / 10000 to a / 100000, whose result I expected to be truncated to zero, since the result data type should be DECIMAL(7,4).

The strangest thing is that if the same expression appears in the projection, it works exactly as the documentation says. This makes no sense to me at all.

How to repeat:
CREATE TABLE t (a DECIMAL(10, 0));
INSERT INTO t VALUES(1);
SELECT * FROM t WHERE a / 10000;

CREATE TABLE t1 AS SELECT a / 10000 AS a FROM t;
SHOW CREATE TABLE t1;
[30 Mar 7:03] MySQL Verification Team
Hello yu lei,

Thank you for the report and test case.

regards,
Umesh
[30 Mar 20:10] Roy Lyseng
This is not a bug in 8.0.
In 8.0 we changed the handling of comparison conditions that are incomplete predicates (e.g. a condition consisting of a single numeric expression): such a condition is now evaluated as an inequality against zero. E.g.

  SELECT * FROM t WHERE a / 10000;

is evaluated exactly like

  SELECT * FROM t WHERE a / 10000 <> 0;

In some cases, this caused a slight deviation from results in 5.7.

It may still look strange that (a/100000) is not equal to zero. However, decimal operations in MySQL may internally be evaluated with greater precision than what is used externally. In the case of decimal division, the number of decimals used internally is 9, and thus the result is different from exact zero.
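For illustration, with the table from the report and the default div_precision_increment, something like the following shows the effect (the displayed value is rounded to the external scale, while the comparison sees the more precise internal value):

  SELECT a / 100000 AS shown,          -- displays 0.0000 (external scale 0 + 4)
         (a / 100000) <> 0 AS nonzero  -- 1, since the internal value 0.000010000 is not exact zero
  FROM t;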

If you want the *exact* intermediate result to have four decimals, you need to cast the division result to e.g. DECIMAL(10,4).
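A minimal sketch of that workaround, using the query from the report: the CAST rounds the intermediate result to four decimals, so it becomes exact zero and the row is filtered out, as in 5.7:

  SELECT * FROM t WHERE CAST(a / 100000 AS DECIMAL(10,4));
  -- Empty set: 0.00001 rounds to 0.0000, which compares equal to zero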