Bug #54367: Prepared statement batch insertion loss

| Submitted: | 9 Jun 2010 13:44 | Modified: | 17 Sep 2012 8:10 |
| --- | --- | --- | --- |
| Reporter: | RICHARDS PETER | Email Updates: | |
| Status: | Not a Bug | Impact on me: | |
| Category: | Connector / J | Severity: | S3 (Non-critical) |
| Version: | 5.1.41-community | OS: | Linux (ubuntu 2.6.28-11-generic) |
| Assigned to: | Alexander Soklakov | CPU Architecture: | Any |
[10 Jun 2010 9:42]
RICHARDS PETER
[10 Jun 2010 12:14]
RICHARDS PETER
I am attaching Java code with minimal threading that can still reproduce the issue. Let me know where I went wrong.
Attachment: PreInsert.java (text/x-java), 7.48 KiB.
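The attachment itself is not reproduced on this page. As a rough stand-in (the class and method names below are hypothetical, not taken from `PreInsert.java`), the reported symptom of rows disappearing from a batch when several threads share one accumulator is the classic lost-update race: an unsynchronized read-modify-write means two threads can both read the same old value and one increment is silently dropped.

```java
// Hypothetical simulation of the failure mode: several threads "batching"
// rows through an unguarded shared counter. This is NOT the reporter's code,
// only an illustration of why unsynchronized sharing can lose inserts.
public class LostUpdates {

    static int batchedRows = 0; // deliberately not volatile, not synchronized

    // Runs `threads` workers, each "adding" rowsPerThread rows to the batch.
    static int runUnsynchronized(int threads, int rowsPerThread)
            throws InterruptedException {
        batchedRows = 0;
        Thread[] workers = new Thread[threads];
        for (int t = 0; t < threads; t++) {
            workers[t] = new Thread(() -> {
                for (int i = 0; i < rowsPerThread; i++) {
                    batchedRows++; // racy read-modify-write: updates can be lost
                }
            });
            workers[t].start();
        }
        for (Thread w : workers) {
            w.join();
        }
        return batchedRows;
    }

    public static void main(String[] args) throws InterruptedException {
        int expected = 8 * 100_000;
        int seen = runUnsynchronized(8, 100_000);
        // With real contention, `seen` is typically below `expected`.
        System.out.println(seen + " of " + expected + " rows survived batching");
    }
}
```

How many rows are lost is timing-dependent, so a run may occasionally show no loss at all; that unpredictability is exactly what makes this class of bug hard to pin down.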
[14 Jun 2010 5:59]
RICHARDS PETER
Modified code with synchronization that solved the issue.
Attachment: PreInsert.java (text/x-java), 7.65 KiB.
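The second attachment is likewise not shown inline. Since JDBC `Connection` and `PreparedStatement` objects are not guaranteed to be thread-safe, the usual fix (which matches the reporter's description of adding synchronization) is to guard every access to the shared batch with one monitor. A minimal sketch of that pattern, with a `List` standing in for the statement's internal batch and all names hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the synchronized batching pattern. The List plays the role of a
// shared PreparedStatement's pending batch; in real Connector/J code the
// synchronized blocks would wrap ps.addBatch() / ps.executeBatch() instead.
public class BatchSync {

    private final List<String> batch = new ArrayList<>();

    // Equivalent of ps.addBatch(): guarded so concurrent adds are never lost.
    public synchronized void addBatch(String row) {
        batch.add(row);
    }

    public synchronized int batchSize() {
        return batch.size();
    }

    public static void main(String[] args) throws InterruptedException {
        BatchSync shared = new BatchSync();
        int threads = 8, rowsPerThread = 1000;
        Thread[] workers = new Thread[threads];
        for (int t = 0; t < threads; t++) {
            final int id = t;
            workers[t] = new Thread(() -> {
                for (int i = 0; i < rowsPerThread; i++) {
                    shared.addBatch("row-" + id + "-" + i);
                }
            });
            workers[t].start();
        }
        for (Thread w : workers) {
            w.join();
        }
        // Every addBatch survives under synchronization: 8 * 1000 rows.
        System.out.println(shared.batchSize());
    }
}
```

An alternative design, which avoids the lock entirely, is to give each thread its own `Connection` and `PreparedStatement`, which is the arrangement JDBC drivers generally expect.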