Bug #82330 Don't recursively-evaluate stopword after tokenize
Submitted:   25 Jul 2016 5:55
Modified:    6 Apr 9:15
Reporter:    Tsubasa Tanaka (OCA)
Status:      Verified
Category:    MySQL Server: FULLTEXT search
Severity:    S1 (Critical)
Version:     5.7.13, 5.7.22, 8.0.16, 8.3.0
OS:          CentOS (6.6)
Assigned to:
CPU Architecture: Any
Tags:        fulltext, NGRAM

Description:
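The title asks that the server not evaluate stopwords (recursively) against the tokens produced by the ngram tokenizer. As a minimal sketch of the feature area only (the table name, data, and query below are illustrative assumptions, not the reporter's actual test case), an InnoDB FULLTEXT index using the ngram parser is created and searched like this:

-- Illustrative only; not the reporter's repro.
CREATE TABLE ft_ngram_demo (
  id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
  body TEXT,
  -- ngram parser splits text into n-grams of length ngram_token_size (default 2)
  FULLTEXT KEY ft_body (body) WITH PARSER ngram
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;

INSERT INTO ft_ngram_demo (body) VALUES ('データベースの全文検索');

SELECT * FROM ft_ngram_demo
WHERE MATCH(body) AGAINST('全文検索' IN BOOLEAN MODE);

Stopword filtering for such an index is driven by INFORMATION_SCHEMA.INNODB_FT_DEFAULT_STOPWORD or by a user table named in innodb_ft_server_stopword_table; the behaviour the title objects to is that this filtering is applied to the ngram tokens after tokenization.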