Bug #36277 Missing and mis-interpreted tokens in tokenizer
Submitted: 23 Apr 2008 8:20    Modified: 13 Oct 2009 17:25
Reporter: Peter Romianowski
Status: Verified
Category: MySQL Proxy: Core    Severity: S3 (Non-critical)
Version: 0.6.1 and trunk       OS: Any
Assigned to: Assigned Account  CPU Architecture: Any
Tags: Contribution

Description:
[23 Apr 2008 8:22] Peter Romianowski
Patch adding missing tokens and unit tests.

Attachment: 36277.patch (text/x-patch), 4.98 KiB.
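
For readers who want to reproduce or extend the tests, below is a minimal, self-contained sketch of what such a tokenizer unit test can look like. It is illustrative only and not taken from 36277.patch: it assumes the API declared in MySQL Proxy's sql-tokenizer.h (sql_tokens_new(), sql_tokenizer(), sql_tokens_free(), and sql_token structs carrying a token_id and a GString *text), assumes sql_tokenizer() returns 0 on success, and uses GLib's test framework (GLib >= 2.16) for brevity. Verify the names, the return convention, and the example query/token positions against your checkout of 0.6.1 or trunk.

/* tokenizer_test_sketch.c -- illustrative only; not part of 36277.patch.
 * Assumed API (check sql-tokenizer.h in your checkout):
 *   GPtrArray *sql_tokens_new(void);
 *   int        sql_tokenizer(GPtrArray *tokens, const gchar *str, gsize len);
 *   void       sql_tokens_free(GPtrArray *tokens);
 *   sql_token structs with a token_id and a GString *text
 */
#include <string.h>
#include <glib.h>

#include "sql-tokenizer.h"

/* tokenize `query` and check the text of the token at position `pos` */
static void assert_token_text_at(const gchar *query, guint pos, const gchar *expected) {
    GPtrArray *tokens = sql_tokens_new();
    sql_token *token;

    /* assumed to return 0 on success */
    g_assert(sql_tokenizer(tokens, query, strlen(query)) == 0);
    g_assert_cmpuint(tokens->len, >, pos);

    token = g_ptr_array_index(tokens, pos);
    g_assert_cmpstr(token->text->str, ==, expected);

    sql_tokens_free(tokens);
}

/* a compound operator must come back as one token, not be split in two
 * (hypothetical example; the tokens the patch actually covers are in 36277.patch) */
static void test_compound_operator(void) {
    assert_token_text_at("SELECT 1 <= 2", 2, "<=");
}

int main(int argc, char **argv) {
    g_test_init(&argc, &argv, NULL);
    g_test_add_func("/tokenizer/compound-operator", test_compound_operator);
    return g_test_run();
}

Checking the token stream position by position catches both failure modes the bug title names: a missing token shifts every later position, and a mis-interpreted one (e.g. "<=" split into "<" and "=") changes the text found at a position.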