Bug #50503 | Read() loop not completing without error when iteration is slow | |
---|---|---|---
Submitted: | 21 Jan 2010 11:56 | Modified: | 26 Oct 2011 17:44 |
Reporter: | baileys walk | Email Updates: | |
Status: | Not a Bug | Impact on me: | |
Category: | Connector / NET | Severity: | S3 (Non-critical) |
Version: | 6.1.2 | OS: | Windows (XP Pro) |
Assigned to: | Reggie Burnett | CPU Architecture: | Any |
Tags: | Connector/Net, loop, Read() | |
[21 Jan 2010 11:56]
baileys walk
[27 Jan 2010 8:52]
Tonci Grgin
Hi and thanks for your report. I think what should be done is the following:

o Be clear about the nature of the problem. Do I absolutely need two worker threads to expose it? If so, would 20 worker threads show the problem quicker?

o Be sure to check the disk (RAID) and RAM for errors. This could simply mean you have a bad sector somewhere.

o Do not jump to conclusions. If this problem is *not* repeatable with any other table, then I'm inclined to think it's table-specific. What is the table creation statement? What do SHOW TABLE STATUS and CHECK TABLE say?

o If there is a Connector/NET problem here with slow loops, can it be reproduced without a specific table, i.e. with long-running queries such as SELECT SLEEP()? (A minimal repro of that kind is sketched below.)
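A minimal sketch of the kind of standalone repro asked for above, written against Connector/NET's public API. The connection string and the table name `some_table` are placeholders, not from the original report; any query returning many rows should do.

```csharp
using System;
using System.Threading;
using MySql.Data.MySqlClient;

class SlowLoopRepro
{
    static void Main()
    {
        // Placeholder connection string -- adjust to the local setup.
        var connStr = "server=localhost;uid=test;pwd=test;database=test";
        using (var conn = new MySqlConnection(connStr))
        {
            conn.Open();
            // `some_table` is hypothetical. To test the long-running-query
            // variant instead, use e.g. "SELECT SLEEP(600)" as the command text.
            using (var cmd = new MySqlCommand("SELECT id FROM some_table", conn))
            using (var reader = cmd.ExecuteReader())
            {
                int rows = 0;
                while (reader.Read())
                {
                    rows++;
                    Thread.Sleep(5000); // simulate slow per-row processing
                }
                // If this prints fewer rows than the table holds and no
                // exception was thrown, the reported behaviour is reproduced.
                Console.WriteLine("Rows read: " + rows);
            }
        }
    }
}
```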
[28 Feb 2010 0:00]
Bugs System
No feedback was provided for this bug for over a month, so it is being suspended automatically. If you are able to provide the information that was originally requested, please do so and change the status of the bug back to "Open".
[26 Oct 2011 17:44]
Reggie Burnett
This is not a bug. Two things could be happening here. First, the server has read/write timeouts that cause it to disconnect if the client is unresponsive for a long time. Second, in our 6.x branches we now use the SQL Server model of command timeout, meaning that we count read/write wait times over the entire operation, so the command can time out during a read loop; however, this should throw a timeout exception. If you are looping through a large number of records and the processing of each record could be lengthy, it is best practice to load those records into a dataset (or some other structure) for processing, as sketched below.
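A minimal sketch of the buffered approach described above, reusing the hypothetical connection string and table from the earlier sketch: MySqlDataAdapter.Fill() drains the whole result set into a DataTable up front, so no server connection is held open while each row is processed.

```csharp
using System;
using System.Data;
using MySql.Data.MySqlClient;

class BufferedProcessing
{
    static void Main()
    {
        // Placeholder connection string and table name, as in the sketch above.
        var connStr = "server=localhost;uid=test;pwd=test;database=test";
        var table = new DataTable();

        using (var conn = new MySqlConnection(connStr))
        using (var adapter = new MySqlDataAdapter("SELECT id FROM some_table", conn))
        {
            // Fill() opens the connection, reads the whole result set, and
            // closes the connection again; the read loop itself stays fast.
            adapter.Fill(table);
        }

        // Lengthy per-row work now happens with no open server connection,
        // so server read/write timeouts cannot cut the loop short.
        foreach (DataRow row in table.Rows)
        {
            ProcessSlowly(row);
        }
    }

    static void ProcessSlowly(DataRow row)
    {
        // placeholder for the application's lengthy per-record processing
    }
}
```

The trade-off is memory: the entire result set is held client-side, which is fine as long as it fits comfortably in RAM.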