Bug #98324 | Deadlocks more frequent since version 5.7.26 | ||
---|---|---|---|
Submitted: | 22 Jan 2020 9:57 | Modified: | 1 Apr 2020 14:14 |
Reporter: | Przemyslaw Malkowski | Email Updates: | |
Status: | Verified | Impact on me: | |
Category: | MySQL Server: InnoDB storage engine | Severity: | S2 (Serious) |
Version: | 5.7.26, 5.7.29, 8.0.20 | OS: | Any |
Assigned to: | | CPU Architecture: | Any
Tags: | deadlock, innodb |
[23 Jan 2020 7:54]
Umesh Shastry
[6 Feb 2020 3:13]
Stephen Wei
The test script for the deadlock.
Attachment: deadlock-test.py (text/x-python-script), 2.45 KiB.
[2 Mar 2020 19:15]
Thanh Nguyen
My test script; run with ./deadlock.sh
Attachment: deadlock.sh (text/x-sh), 634 bytes.
[9 Apr 2021 6:28]
yz c
Batch SQL inserting 1000 rows.
Attachment: insert_1000_rows.sql (application/octet-stream, text), 241.44 KiB.
[9 Apr 2021 6:28]
yz c
Batch SQL inserting 15 rows.
Attachment: insert_15_rows.sql (application/octet-stream, text), 3.84 KiB.