Description:
ps_statement_avg_latency_histogram often fails to draw the point for max(avg_us). In the example below, Total Statements is 8 but the buckets account for only 7; the 807309 us row falls outside the highest bucket.
mysql> SELECT * FROM sys.x$ps_digest_avg_latency_distribution;
+-----+--------+
| cnt | avg_us |
+-----+--------+
|   1 |    130 |
|   1 |   2545 |
|   1 |   6574 |
|   1 |  10336 |
|   1 |  21768 |
|   1 |  27656 |
|   1 |  43610 |
|   1 | 807309 |
+-----+--------+
8 rows in set (0.00 sec)
mysql> CALL ps_statement_avg_latency_histogram()\G
*************************** 1. row ***************************
Performance Schema Statement Digest Average Latency Histogram:
. = 1 unit
* = 2 units
# = 3 units
(0 - 50ms)     7 | .......
(50 - 101ms)   0 |
(101 - 151ms)  0 |
(151 - 202ms)  0 |
(202 - 252ms)  0 |
(252 - 303ms)  0 |
(303 - 353ms)  0 |
(353 - 404ms)  0 |
(404 - 454ms)  0 |
(454 - 504ms)  0 |
(504 - 555ms)  0 |
(555 - 605ms)  0 |
(605 - 656ms)  0 |
(656 - 706ms)  0 |
(706 - 757ms)  0 |
(757 - 807ms)  0 |
Total Statements: 8; Buckets: 16; Bucket Size: 50 ms;
1 row in set (0.01 sec)
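The missing point can be confirmed by redoing the procedure's bucket math by hand against the data above (a quick sketch; the 16-bucket count and the ROUND() formula are quoted from the procedure source under the suggested fix):

SELECT MAX(avg_us) AS max_us,
       ROUND((MAX(avg_us) - MIN(avg_us)) / 16) AS bucket_size,
       ROUND((MAX(avg_us) - MIN(avg_us)) / 16) * 16 AS top_edge
FROM sys.x$ps_digest_avg_latency_distribution;
-- With the data above: max_us = 807309, bucket_size = 50449, and
-- top_edge = 807184 < 807309, so the max(avg_us) point falls outside
-- every bucket and only 7 of the 8 statements are drawn.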
How to repeat:
See description.
Suggested fix:
In this procedure, the bucket size is defined as:
SELECT ROUND((MAX(avg_us) - MIN(avg_us)) / (@buckets := 16)) AS size
FROM sys.x$ps_digest_avg_latency_distribution
But since the lowest range always begins at 0, the size should be defined as:
SELECT ROUND(MAX(avg_us) / (@buckets := 16)) AS size
FROM sys.x$ps_digest_avg_latency_distribution
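With this change the highest bucket covers the maximum for the sample data: ROUND(807309 / 16) = 50457, and 50457 * 16 = 807312 >= 807309. ROUND() can still round the size down in general (e.g. a maximum of 100 gives a size of 6 and a top edge of only 96), which is why the second change below is needed as well.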
Also, count_in_bucket for the highest bucket is defined as:
IFNULL((SELECT SUM(cnt) FROM sys.x$ps_digest_avg_latency_distribution AS b1
        WHERE b1.avg_us > @bucket_size * 15 AND b1.avg_us <= @bucket_size * 16), 0)
I think it should be defined as:
IFNULL((SELECT SUM(cnt) FROM sys.x$ps_digest_avg_latency_distribution AS b1
        WHERE b1.avg_us > @bucket_size * 15), 0)
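Dropping the upper bound on the highest bucket also covers the case where ROUND() shrinks the top edge below max(avg_us), so the two changes complement each other. The difference can be checked against the sample data (a sketch; @bucket_size = 50449 is what the current formula produces, as computed above):

SET @bucket_size = 50449;
SELECT (SELECT IFNULL(SUM(cnt), 0)
          FROM sys.x$ps_digest_avg_latency_distribution AS b1
         WHERE b1.avg_us > @bucket_size * 15
           AND b1.avg_us <= @bucket_size * 16) AS current_count, -- returns 0
       (SELECT IFNULL(SUM(cnt), 0)
          FROM sys.x$ps_digest_avg_latency_distribution AS b1
         WHERE b1.avg_us > @bucket_size * 15) AS fixed_count;    -- returns 1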