Bug #5974 Exception "trying to fetch row:null"
Submitted: 8 Oct 2004 9:46 Modified: 26 Oct 2004 0:12
Reporter: Alexander Flödl
Status: Won't fix
Category: Connector / J Severity: S3 (Non-critical)
Version: 4.0.21 OS: United Linux
Assigned to: Mark Matthews CPU Architecture: Any

[8 Oct 2004 9:46] Alexander Flödl
Description:
I have the same problem as described in bug #736, with MySQL version 4.0.21.
We are developing software that runs on a PC system with a 233 MHz or 533 MHz CPU and 64 MB of
memory.

I'm using one connection for streaming data from one table and another connection for all other statements (e.g. UPDATE).

After reading 2000 records I get the error "trying to fetch row:null" and the result set is closed.

For each record read, a field (a state value) is updated to mark that record as processed.
This is done after reading 100 records, for the last 100 processed records.
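For reference, this is roughly how a streaming result set is set up with Connector/J (a minimal sketch; the connection variables, table, and column names are made up for illustration):

    // Connection 1: used only for streaming rows from table A.
    // TYPE_FORWARD_ONLY + CONCUR_READ_ONLY + fetch size Integer.MIN_VALUE
    // tells Connector/J to stream rows one at a time instead of
    // buffering the whole result set in memory.
    Statement streamStmt = streamConn.createStatement(
            ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
    streamStmt.setFetchSize(Integer.MIN_VALUE);
    ResultSet rs = streamStmt.executeQuery(
            "SELECT id, payload FROM tableA WHERE state = 0");

    // Connection 2: used for all other statements (e.g. UPDATEs),
    // because the streaming connection is busy until the result set
    // has been fully read or closed.
    PreparedStatement update = updateConn.prepareStatement(
            "UPDATE tableA SET state = 1 WHERE id = ?");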

How to repeat:
These are the individual program steps (see the sketch after this list):
1. Read 100 records via streaming from table A, build an object from each record, and add it to an ArrayList.
2. Process each object: first update a field in another table B, then update the state of the record in table A.
3. Continue with the next 100 records, and so on.
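A rough sketch of these steps, reusing the hypothetical names from above (Record and processBatch are made-up placeholders):

    ArrayList batch = new ArrayList(); // pre-Java-5 style, matching the 2004 setup
    while (rs.next()) {
        // Step 1: build an object from the current row (Record is a
        // hypothetical value class holding the row's fields).
        batch.add(new Record(rs.getInt("id"), rs.getString("payload")));
        if (batch.size() == 100) {
            // Step 2: process each object -- update table B first, then
            // the state column in table A (processBatch is hypothetical).
            processBatch(batch);
            batch.clear(); // Step 3: continue with the next 100 records
        }
    }
    if (!batch.isEmpty()) {
        processBatch(batch); // final partial batch
    }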

Suggested fix:
I have written a workaround: if this error occurs, I create a new statement. But this costs a lot of time on such a low-powered system.
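Roughly, the workaround looks like this (a sketch under the assumption that the query's WHERE clause skips already-processed rows, so re-running it resumes near where the error occurred):

    boolean done = false;
    while (!done) {
        try {
            while (rs.next()) {
                // ... process the current row ...
            }
            done = true;
        } catch (SQLException e) {
            // When "trying to fetch row:null" hits, the result set is
            // closed; recreate the statement and re-run the query. Rows
            // already marked processed no longer match state = 0.
            streamStmt = streamConn.createStatement(
                    ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
            streamStmt.setFetchSize(Integer.MIN_VALUE);
            rs = streamStmt.executeQuery(
                    "SELECT id, payload FROM tableA WHERE state = 0");
        }
    }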
[18 Oct 2004 20:07] Mark Matthews
The error message you posted does not appear to come from our driver, but from application code.

Could you please post the _full_ stack trace from your exception, as well as the version of the _JDBC_ driver you're using (the server version doesn't matter in this case).
[25 Oct 2004 16:44] Alexander Flödl
I have changed the method for reading and updating the records:

- First I read <n> records from the streaming result set (table A).
- Next I update the records in table B with the data from the stream.
- I create a batch update statement to change the state in table A.
- After <n2> records the streaming result set is aborted and closed,
   and the batch statements are executed.
- After that, I reopen the streaming result set with the rest of the records.

This proceeds until all records are updated.

This works for me; a rough sketch of the pattern follows below.
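A minimal sketch, again with hypothetical names; bounding each streaming pass with LIMIT is one assumed way to abort the result set after <n2> records:

    final int N2 = 1000; // <n2>: records per streaming pass (arbitrary value)
    boolean more = true;
    while (more) {
        Statement streamStmt = streamConn.createStatement(
                ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        streamStmt.setFetchSize(Integer.MIN_VALUE);
        // LIMIT bounds this pass; re-running the query later picks up
        // the rest, because processed rows no longer match state = 0.
        ResultSet rs = streamStmt.executeQuery(
                "SELECT id, payload FROM tableA WHERE state = 0 LIMIT " + N2);

        PreparedStatement updateB = updateConn.prepareStatement(
                "UPDATE tableB SET data = ? WHERE id = ?");
        PreparedStatement stateA = updateConn.prepareStatement(
                "UPDATE tableA SET state = 1 WHERE id = ?");

        int count = 0;
        while (rs.next()) {
            // Update table B immediately with the data from the stream.
            updateB.setString(1, rs.getString("payload"));
            updateB.setInt(2, rs.getInt("id"));
            updateB.executeUpdate();
            // Only QUEUE the state change for table A; executing it now
            // would touch the table the stream is reading from.
            stateA.setInt(1, rs.getInt("id"));
            stateA.addBatch();
            count++;
        }
        // Close the stream first, then run the batched updates on table A.
        rs.close();
        streamStmt.close();
        stateA.executeBatch();
        updateB.close();
        stateA.close();
        more = (count == N2); // fewer than N2 rows left means we're finished
    }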

Sorry, we don't have much time to follow up on this error anymore, but it seems
that reading from a streaming result set while updating records in the same table
causes this error.
[26 Oct 2004 0:12] Mark Matthews
This can't be fixed if you're using MyISAM, unless you use SQL_BIG_RESULT as a flag in your query (to cause MySQL to buffer the result in a temporary table). There is nothing the JDBC driver can do on your behalf, as MyISAM is _not_ transactional and doesn't guarantee consistent reads. If you want consistent reads, you need to use InnoDB tables.
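For example, the hint goes directly after the SELECT keyword (a sketch, reusing the hypothetical query from above):

    // Per the comment above, SQL_BIG_RESULT causes MySQL to buffer the
    // result in a temporary table on the server, so later UPDATEs to
    // tableA cannot disturb the rows being streamed.
    ResultSet rs = streamStmt.executeQuery(
            "SELECT SQL_BIG_RESULT id, payload FROM tableA WHERE state = 0");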