| Bug #56965 | mysqldump poor performance with large number of tables | | |
|---|---|---|---|
| Submitted: | 23 Sep 2010 9:22 | Modified: | 29 Sep 2010 9:33 |
| Reporter: | Peter Surda | Email Updates: | |
| Status: | Verified | Impact on me: | |
| Category: | MySQL Server: mysqldump Command-line Client | Severity: | S3 (Non-critical) |
| Version: | 5.1.44 | OS: | Linux (2.6.18-164.6.1.el5 #1 SMP, CentOS 5.5) |
| Assigned to: | | CPU Architecture: | Any |
| Tags: | Contribution, mysqldump, performance, speed | | |
[23 Sep 2010 9:41] Peter Surda

[23 Sep 2010 10:23] Peter Surda
Added missing table in the previous patch
Attachment: mysqldump-speed.patch (text/x-patch), 5.35 KiB.
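The report itself does not include a reproduction script. A minimal sketch of one way to observe the reported slowdown is shown below; the database name, table count, and credentials are placeholders chosen for illustration, not values taken from the bug.

```sh
# Hypothetical reproduction sketch: create a schema with many small tables,
# then time a plain mysqldump of that schema. The slowdown reported in this
# bug grows with the number of tables being dumped.
mysql -u root -e 'CREATE DATABASE IF NOT EXISTS manytables'

# Create a large number of trivial tables (10000 is an arbitrary choice).
for i in $(seq 1 10000); do
  mysql -u root manytables -e "CREATE TABLE t$i (id INT PRIMARY KEY)"
done

# Time the dump; the output is discarded since only elapsed time matters here.
time mysqldump -u root manytables > /dev/null
```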