Bug #79457 | Memory consumption and request errors during table scan | ||
---|---|---|---|
Submitted: | 30 Nov 2015 13:51 | Modified: | 19 Mar 2020 17:20 |
Reporter: | Mike Pentagra | Email Updates: | |
Status: | No Feedback | Impact on me: | |
Category: | MySQL Cluster: NDB API | Severity: | S5 (Performance) |
Version: | 7.4.7 | OS: | Oracle Linux |
Assigned to: | MySQL Verification Team | CPU Architecture: | Any |
[30 Nov 2015 13:53]
Mike Pentagra
Sample tables and some code to reproduce the problem.
Attachment: ndb_ty.cpp (application/octet-stream, text), 5.67 KiB.