| Bug #95568 | Memory consumption rapidly increases when using a cursor when it should be level | | |
|---|---|---|---|
| Submitted: | 29 May 2019 15:50 | Modified: | 28 Jul 2019 11:50 |
| Reporter: | Andrew Spode | Email Updates: | |
| Status: | No Feedback | Impact on me: | |
| Category: | Connector / Python | Severity: | S2 (Serious) |
| Version: | 8.0.16 | OS: | Linux (Docker Python3 Image) |
| Assigned to: | Assigned Account | CPU Architecture: | Any |
[28 Jun 2019 11:50]
MySQL Verification Team
Hello Andrew, thank you for the report and feedback. Could you please provide an exact, repeatable test case (create table statement, Python script, and optionally details such as the MySQL Server version and whether the instance is local or remote) so that we can reproduce this issue at our end? Thank you. Regards, Umesh
[9 Aug 2019 1:00]
Bugs System
No feedback was provided for this bug for over a month, so it is being suspended automatically. If you are able to provide the information that was originally requested, please do so and change the status of the bug back to "Open".

Description:
I am checking nearly 70 million rows (timeseries data). Since I'm only fetching a single row at a time, memory usage should stay constant. However, it very quickly increases until it takes over 8 GB of memory. This is not present in version 8.0.11 using identical code.

How to repeat:

    for timestamp, value in cursor:
        # My code here is irrelevant as it's not the issue; it works fine in 8.0.11
        pass

Suggested fix:
I have managed to stop the memory from increasing with the following:

    cursor = cnx.cursor(buffered=False, raw=True)

and then decoding the raw values:

    timestamp = float(row[0].decode('utf-8'))
    value = float(row[1].decode('utf-8'))

Unfortunately, I don't know which change made the difference, as I made both at the same time. My understanding was/is that buffered already defaults to False, so it is likely something to do with raw.
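For context, a minimal sketch of the workaround described in the report is shown below. The table name `readings`, its `timestamp` and `value` columns, and the connection parameters are hypothetical placeholders, not taken from the report; only the `buffered=False, raw=True` cursor options and the manual decoding come from the reporter's suggested fix.

```python
import mysql.connector

# Hypothetical connection details for illustration only.
cnx = mysql.connector.connect(user="user", password="secret",
                              host="127.0.0.1", database="metrics")

# Unbuffered, raw cursor: rows are streamed from the server one at a time,
# and column values are returned as raw bytes without connector-side
# type conversion.
cursor = cnx.cursor(buffered=False, raw=True)
cursor.execute("SELECT `timestamp`, `value` FROM readings")

for row in cursor:
    # With raw=True the connector skips its Python-side conversion,
    # so each column must be decoded and cast manually.
    timestamp = float(row[0].decode("utf-8"))
    value = float(row[1].decode("utf-8"))
    # ... process one row at a time ...

cursor.close()
cnx.close()
```

As the reporter notes, it is unclear whether the unbuffered cursor or the raw decoding is what avoids the growth in memory use, since both changes were made together.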