Description:
One of my tables contains a BLOB column, and sometimes the size of this column's value is bigger than 500 MB, which is larger than the default max heap size, e.g. -Xmx is configured to 512M in my application server (JBoss).
In my code, I have set the fetch size of the PreparedStatement to Integer.MIN_VALUE to use MySQL's streaming support. I also added "useServerPrepStmts=true" to the JDBC URL.
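A minimal sketch of this setup (host, port, schema, credentials, and table/column names below are placeholders, not my real configuration):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class StreamingSetup {
    // Builds a Connector/J URL with server-side prepared statements enabled.
    public static String streamingUrl(String host, int port, String schema) {
        return "jdbc:mysql://" + host + ":" + port + "/" + schema
                + "?useServerPrepStmts=true";
    }

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                streamingUrl("localhost", 3306, "test"), "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT blobColumn FROM myTable")) {
            // Integer.MIN_VALUE asks Connector/J to stream the result row by row
            ps.setFetchSize(Integer.MIN_VALUE);
        }
    }
}
```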
With these settings, I can insert a record with a large column value (>500 MB) successfully.
But when I try to read the record back, I always hit an OutOfMemoryError when calling ResultSet.next().
ResultSet rs = preparedStatement.executeQuery(...);
rs.next();
Note: the executeQuery call itself passes. Judging by the heap usage, executeQuery does not load the actual record data into memory; instead, ResultSet.next() tries to load the column. Whichever call actually loads the data, BLOB data is supposed to be read via a stream, as below, which means the whole column value should not be loaded into memory at once, since that causes the OutOfMemoryError. As far as I know, Oracle supports reading BLOB values as a stream, so there is no OutOfMemoryError issue with Oracle.
InputStream stream = rs.getBinaryStream(name);
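The stream obtained this way should then be consumed in fixed-size chunks, so only one buffer's worth of data is on the heap at a time regardless of column size. A sketch of such a copy loop (the 8 KB buffer size is an arbitrary choice):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class BlobStreamCopy {
    // Copies the BLOB stream to a destination in 8 KB chunks,
    // keeping heap usage constant regardless of the column size.
    public static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            total += read;
        }
        return total;
    }
}
```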
Below is the exception stack trace.
2013-09-18 13:18:00,169 ERROR [STDERR] Caused by: java.lang.OutOfMemoryError: Java heap space
2013-09-18 13:18:00,169 ERROR [STDERR] at com.mysql.jdbc.Buffer.ensureCapacity(Buffer.java:156)
2013-09-18 13:18:00,169 ERROR [STDERR] at com.mysql.jdbc.Buffer.writeBytesNoNull(Buffer.java:514)
2013-09-18 13:18:00,169 ERROR [STDERR] at com.mysql.jdbc.MysqlIO.readRemainingMultiPackets(MysqlIO.java:3219)
2013-09-18 13:18:00,169 ERROR [STDERR] at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:3077)
2013-09-18 13:18:00,169 ERROR [STDERR] at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2979)
2013-09-18 13:18:00,169 ERROR [STDERR] at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3520)
2013-09-18 13:18:00,169 ERROR [STDERR] at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:935)
2013-09-18 13:18:00,169 ERROR [STDERR] at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1433)
2013-09-18 13:18:00,169 ERROR [STDERR] at com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:416)
2013-09-18 13:18:00,169 ERROR [STDERR] at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:395)
2013-09-18 13:18:00,169 ERROR [STDERR] at com.mysql.jdbc.ResultSetImpl.next(ResultSetImpl.java:7165)
2013-09-18 13:18:00,169 ERROR [STDERR] at org.apache.commons.dbcp.DelegatingResultSet.next(DelegatingResultSet.java:169)
2013-09-18 13:18:00,169 ERROR [STDERR] at org.apache.commons.dbcp.DelegatingResultSet.next(DelegatingResultSet.java:169)
2013-09-18 13:18:00,169 ERROR [STDERR] at sun.reflect.GeneratedMethodAccessor86.invoke(Unknown Source)
2013-09-18 13:18:00,169 ERROR [STDERR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
2013-09-18 13:18:00,169 ERROR [STDERR] at java.lang.reflect.Method.invoke(Method.java:597)
2013-09-18 13:18:00,169 ERROR [STDERR] at org.apache.jackrabbit.core.util.db.ResultSetWrapper.invoke(ResultSetWrapper.java:66)
2013-09-18 13:18:00,169 ERROR [STDERR] at $Proxy68.next(Unknown Source)
2013-09-18 13:18:00,169 ERROR [STDERR] at org.apache.jackrabbit.core.data.db.DbDataStore.openStream(DbDataStore.java:542)
2013-09-18 13:18:00,169 ERROR [STDERR] at org.apache.jackrabbit.core.data.db.DbInputStream.openStream(DbInputStream.java:68)
2013-09-18 13:18:00,169 ERROR [STDERR] at org.apache.jackrabbit.core.data.db.DbInputStream.read(DbInputStream.java:110)
How to repeat:
Create a table with one BLOB column.
Insert one record with a big column value, bigger than 500 MB. After that, read it back with:
ResultSet rs = preparedStatement.executeQuery("SELECT blobColumn FROM table");
rs.next();
InputStream stream = rs.getBinaryStream("blobColumn");
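The steps above can be sketched end to end as follows (URL, credentials, and table/column names are placeholders; the synthetic zero-byte stream just avoids allocating 600 MB on the client when inserting):

```java
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class BlobOomRepro {
    // A synthetic stream that yields `size` zero bytes without buffering them.
    static InputStream zeros(final long size) {
        return new InputStream() {
            long remaining = size;
            @Override public int read() {
                if (remaining <= 0) return -1;
                remaining--;
                return 0;
            }
        };
    }

    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://localhost:3306/test?useServerPrepStmts=true";
        long size = 600L * 1024 * 1024; // > 500 MB
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO myTable (blobColumn) VALUES (?)")) {
                ps.setBinaryStream(1, zeros(size), size);
                ps.executeUpdate(); // the insert succeeds
            }
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT blobColumn FROM myTable")) {
                ps.setFetchSize(Integer.MIN_VALUE);
                try (ResultSet rs = ps.executeQuery()) {
                    rs.next(); // OutOfMemoryError is thrown here
                    InputStream stream = rs.getBinaryStream("blobColumn");
                }
            }
        }
    }
}
```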
Suggested fix:
I wish that for BLOB data, MySQL would not load the whole column value into memory at once, but would support streaming reads, as Oracle does.