| Bug #54436 | Deadlock on concurrent FLUSH WITH READ LOCK, ALTER TABLE, ANALYZE TABLE | | |
|---|---|---|---|
| Submitted: | 11 Jun 2010 16:55 | Modified: | 17 Mar 2011 19:04 |
| Reporter: | Matthias Leich | Email Updates: | |
| Status: | Won't fix | Impact on me: | |
| Category: | MySQL Server: Locking | Severity: | S3 (Non-critical) |
| Version: | 5.1.38-bzr, mysql-trunk-runtime | OS: | Any |
| Assigned to: | | CPU Architecture: | Any |
| Tags: | deadlock | | |
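For context, the bug title describes a deadlock among three statements running in separate sessions. The following is a minimal sketch of that scenario; the table name `t1` and the `ALTER TABLE` change are hypothetical placeholders, since the reporter's actual test case is not included in this excerpt.

```sql
-- Hypothetical three-session sketch of the reported scenario.
-- Table name t1 and the column change are placeholders, not from the report.

-- Session 1: acquire the global read lock.
FLUSH TABLES WITH READ LOCK;

-- Session 2: attempt a DDL change, which blocks behind the global read lock.
ALTER TABLE t1 ADD COLUMN c2 INT;

-- Session 3: update table statistics; the interaction of its locks with the
-- other two sessions is the kind of cycle the report title describes.
ANALYZE TABLE t1;
```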
[11 Jun 2010 17:10]
Matthias Leich