Description:
I hope someone can help; this is my first post.
I'm getting an out-of-memory error trying to store large BLOBs in a MySQL database, and I think I have tracked the issue down to Connector/J.
I use code like this to set the BLOB field from a java.io.InputStream:
InputStream fstream = new FileInputStream("C:\\c.zip");
int a = fstream.available(); // note: available() is only an estimate of the file size
m_PreStmt.setBinaryStream(5, fstream, a);
When the file size gets over 100 MB or 200 MB, I get an out-of-memory error.
My system has over a gigabyte of memory, and I have the JVM heap set to 500 MB (-Xmx500m).
My understanding of setting a BLOB parameter from an InputStream is that it should avoid
loading the whole chunk of data into memory at once; the stream should be read a little
at a time.
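In the meantime, the only workaround I can think of is to ship the file in small pieces so the driver never has to buffer more than one chunk. Below is a minimal sketch of that idea; the table name (files), column name (data), and chunk size are assumptions, not my real schema:

import java.io.FileInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.Arrays;

public class ChunkedBlobWrite {
    // Appends the file to files.data one small piece at a time, so the
    // driver's packet buffer never holds more than one chunk.
    static void writeInChunks(Connection conn, long id, String path)
            throws Exception {
        final int CHUNK = 1024 * 1024; // 1 MB; keep well under max_allowed_packet

        // CONCAT(NULL, x) yields NULL, so start from an empty column
        PreparedStatement reset = conn.prepareStatement(
                "UPDATE files SET data = '' WHERE id = ?");
        reset.setLong(1, id);
        reset.executeUpdate();
        reset.close();

        PreparedStatement append = conn.prepareStatement(
                "UPDATE files SET data = CONCAT(data, ?) WHERE id = ?");
        InputStream in = new FileInputStream(path);
        byte[] buf = new byte[CHUNK];
        int n;
        while ((n = in.read(buf)) > 0) {
            append.setBytes(1, n == CHUNK ? buf : Arrays.copyOf(buf, n));
            append.setLong(2, id);
            append.executeUpdate();
        }
        in.close();
        append.close();
    }
}

Each UPDATE round trip only carries one chunk, so the chunk size just has to stay under max_allowed_packet.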
I have tried several versions of Connector/J (3.0.11, 3.0.15-ga, and 3.1.4-beta); all give the same out-of-memory error.
In the Connector/J 3.1.4-beta source, the error occurs in the method ensureCapacity(int additionalData).
The offending line is: byte[] newBytes = new byte[newLength];
It seems that the entire chunk of data is being allocated in memory at once?
Can anyone offer any help?
This also seems to be the same type of problem as reported in bug #1605:
http://bugs.mysql.com/bug.php?id=1605
Thank you!
final void ensureCapacity(int additionalData) throws SQLException {
    if ((this.position + additionalData) > getBufLength()) {
        if ((this.position + additionalData) < this.byteBuffer.length) {
            // byteBuffer.length is != getBufLength() all of the time
            // due to re-using of packets (we don't shrink them)
            //
            // If we can, don't re-alloc, just set buffer length
            // to size of current buffer
            setBufLength(this.byteBuffer.length);
        } else {
            //
            // Otherwise, re-size, and pad so we can avoid
            // allocing again in the near future
            //
            int newLength = (int) (this.byteBuffer.length * 1.25);

            if (newLength < (this.byteBuffer.length + additionalData)) {
                newLength = this.byteBuffer.length
                        + (int) (additionalData * 1.25);
            }

            if (newLength < this.byteBuffer.length) {
                newLength = this.byteBuffer.length + additionalData;
            }

--->        byte[] newBytes = new byte[newLength];

            System.arraycopy(this.byteBuffer, 0, newBytes, 0,
                    this.byteBuffer.length);

            this.byteBuffer = newBytes;
            setBufLength(this.byteBuffer.length);
        }
    }
}
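To see why a 500 MB heap is not enough here, note that the re-size keeps both arrays alive at once: System.arraycopy needs the old buffer and the new 1.25x buffer simultaneously, so crossing a 200 MB payload means roughly 200 MB + 250 MB = ~450 MB of contiguous byte arrays at peak, before counting anything else in the application. A standalone simulation of that 1.25x growth pattern (my numbers, not driver code):

public class GrowthSim {
    public static void main(String[] args) {
        long len = 16L * 1024 * 1024;      // assume the buffer is at 16 MB already
        long target = 200L * 1024 * 1024;  // a ~200 MB payload
        while (len < target) {
            long newLen = (long) (len * 1.25);
            // during arraycopy, the old and new arrays are both reachable
            System.out.printf("grow %d MB -> %d MB, peak live ~%d MB%n",
                    len >> 20, newLen >> 20, (len + newLen) >> 20);
            len = newLen;
        }
    }
}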
How to repeat:
Set a BLOB field to a large file (200 MB or more), like this:
InputStream fstream = new FileInputStream("C:\\c.zip");
int a = fstream.available();
m_PreStmt.setBinaryStream(5, fstream, a);
Try inserting the row into a table.
You will get an out-of-memory error (I have my JVM heap set to 500 MB).
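For completeness, here is a self-contained version of the repro; the JDBC URL, credentials, and table layout (files(id INT, data LONGBLOB)) are assumptions. I use File.length() here rather than available(), since available() is only an estimate. Run with -Xmx500m to match my setup:

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BlobRepro {
    public static void main(String[] args) throws Exception {
        Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost/test", "user", "password");
        File f = new File("C:\\c.zip"); // any file of ~200 MB or more
        InputStream fstream = new FileInputStream(f);
        PreparedStatement stmt = conn.prepareStatement(
                "INSERT INTO files (id, data) VALUES (?, ?)");
        stmt.setInt(1, 1);
        stmt.setBinaryStream(2, fstream, (int) f.length());
        stmt.executeUpdate(); // the OutOfMemoryError is raised here
        fstream.close();
        stmt.close();
        conn.close();
    }
}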
Suggested fix:
This is the method causing the out-of-memory error in Connector/J; it is quoted in full in the Description above.
I copied it from the 3.1.4-beta version, but the same problem exists in 3.0.11 and 3.0.15-ga.
The specific line of code is: byte[] newBytes = new byte[newLength];
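A proper fix would presumably have to stream the parameter data instead of buffering it client-side. Until then, it may also be worth testing whether the server-side prepared statements in the 3.1.x driver behave differently; as I understand it (treat the property name as an assumption), they are enabled via the connection URL:

// Assumption: useServerPrepStmts switches the 3.1.x driver to server-side
// prepared statements, which might change how streamed parameters are sent.
Connection conn = DriverManager.getConnection(
        "jdbc:mysql://localhost/test?useServerPrepStmts=true", "user", "password");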