Bug #82330 | Don't recursively-evaluate stopword after tokenize | |
---|---|---|---
Submitted: | 25 Jul 2016 5:55 | Modified: | 6 Apr 9:15
Reporter: | Tsubasa Tanaka (OCA) | Email Updates: |
Status: | Verified | Impact on me: |
Category: | MySQL Server: FULLTEXT search | Severity: | S1 (Critical)
Version: | 5.7.13, 5.7.22, 8.0.16, 8.3.0 | OS: | CentOS (6.6)
Assigned to: | | CPU Architecture: | Any
Tags: | fulltext, NGRAM | |