Bug #50552 | ndb_size.pl fails to calculate utf8 fields correctly | |
---|---|---|---
Submitted: | 22 Jan 2010 17:18 | Modified: | 16 Apr 2012 11:52
Reporter: | Patrick Mulvany (OCA) | Email Updates: |
Status: | In review | Impact on me: |
Category: | MySQL Cluster: Cluster (NDB) storage engine | Severity: | S3 (Non-critical)
Version: | mysql-5.1-telco-7.0, 5.5.20-ndb-7.2.5 | OS: | Any
Assigned to: | | CPU Architecture: | Any
Tags: | 7.0.9, Contribution | |
[26 Jan 2010 15:21]
Patrick Mulvany
[11 Feb 2010 11:41]
Patrick Mulvany
Updated patch to add a --default-character-set=utf8|ucs2|utf16 option; the script now detects column character sets. The defunct --utf8 option has been removed.
Attachment: ndb_size.pl.patch (application/octet-stream, text), 6.68 KiB.
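For illustration, the column-charset detection this comment describes could look roughly like the following Perl/DBI fragment. This is a sketch of the idea, not the attached patch; the connection parameters and the column_charset helper are hypothetical:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical sketch of per-column character set detection; not the
# code from ndb_size.pl.patch.
my $dbh = DBI->connect("DBI:mysql:database=information_schema;host=localhost",
                       "root", "", { RaiseError => 1 });

my $default_charset = 'utf8';   # would be set from --default-character-set

# Return the character set recorded for a column in
# information_schema.COLUMNS, or the default for non-character columns.
sub column_charset {
    my ($db, $table, $column) = @_;
    my ($cs) = $dbh->selectrow_array(
        'SELECT CHARACTER_SET_NAME FROM information_schema.COLUMNS' .
        ' WHERE TABLE_SCHEMA = ? AND TABLE_NAME = ? AND COLUMN_NAME = ?',
        undef, $db, $table, $column);
    return defined $cs ? $cs : $default_charset;
}
```

With the patched script, the new option would presumably be passed on the command line, e.g. something like `ndb_size.pl --database=test --default-character-set=utf8`; the exact invocation depends on the ndb_size.pl version in use.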
[11 Feb 2010 14:12]
Patrick Mulvany
Full fix handles any character set, pulling the per-character length from information_schema, with a backout to fixed values for utf8, ucs2, and utf16.
Attachment: ndb_size.pl.patch (application/octet-stream, text), 6.97 KiB.
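The lookup-with-backout scheme this comment describes could be sketched as below (illustrative Perl, not the attached patch; the charset_maxlen helper is hypothetical, and the fixed byte widths are MySQL's documented maxima: utf8 = 3, ucs2 = 2, utf16 = 4):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical sketch of the lookup-with-backout idea; not the code
# from the attached patch.
my $dbh = DBI->connect("DBI:mysql:database=information_schema;host=localhost",
                       "root", "", { RaiseError => 1 });

# MySQL's documented per-character maxima, used when the server
# lookup fails: utf8 = 3 bytes, ucs2 = 2 bytes, utf16 = 4 bytes.
my %fixed_maxlen = ( utf8 => 3, ucs2 => 2, utf16 => 4 );

# Prefer MAXLEN from information_schema.CHARACTER_SETS; back out to
# the fixed table, then to 1 byte for unknown single-byte charsets.
sub charset_maxlen {
    my ($charset) = @_;
    my $maxlen = eval {
        my ($len) = $dbh->selectrow_array(
            'SELECT MAXLEN FROM information_schema.CHARACTER_SETS' .
            ' WHERE CHARACTER_SET_NAME = ?', undef, $charset);
        $len;
    };
    return $maxlen // $fixed_maxlen{$charset} // 1;
}

# Example: a VARCHAR(100) utf8 column can need up to
# 100 * charset_maxlen('utf8') = 300 bytes in its NDB row.
```

Querying the server keeps the size estimate correct for any installed character set, while the fixed values keep the tool usable when the information_schema lookup is unavailable.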
[26 Mar 2010 18:09]
Frazer Clement
Modified version of latest patch
Attachment: bug#50522.patch (text/x-patch), 9.23 KiB.