Bug #9850  Segmentation fault after insert of 800,000
Submitted: 12 Apr 2005 16:15    Modified: 7 May 2005 11:00
Reporter: Donald Huebschman     Email Updates:
Status: Can't repeat            Impact on me: None
Category: MySQL Server: Command-line Clients    Severity: S3 (Non-critical)
Version: 5.0.2a standard        OS: MacOS (OSX 10.2.8)
Assigned to: Ingo Strüwing      CPU Architecture: Any

[12 Apr 2005 16:15] Donald Huebschman
Description:
A segmentation fault (EXC_BAD_ACCESS(0x0001)) occurred while importing a dump of a 4.0.18 table with 1,110,000 rows into a 5.0.2a server. The first attempt failed at about 830,000 rows and the second at about 980,000. I did not notice any increase in mysql's memory usage. The dump was a plain mysqldump of the table, without options.

Mac G3, OS X 10.2.8, 640 MB RAM

How to repeat:
Dump a table with more than a million rows and try to reload it.
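For reference, those steps amount to something like the following; the database and table names here are placeholders, not taken from the original report:

  # dump the table from the 4.0.18 server (plain dump, no extra options)
  mysqldump -u root -p testdb bigtable > bigtable.sql

  # reload the dump into the 5.0.2a server
  mysql -u root -p testdb < bigtable.sql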

Suggested fix:
Add a file-size or insert-count limit to mysqldump that splits the output across files by incrementing a given file name.
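Until such an option exists, one possible workaround is to dump the table in primary-key ranges so no single file grows too large. This is only a sketch; it assumes an integer primary key named id and the same placeholder names as above:

  # first chunk carries the CREATE TABLE statement
  mysqldump -u root -p --where="id < 250000" testdb bigtable > bigtable.000.sql
  # later chunks skip table creation so they can be loaded after the first
  mysqldump -u root -p --no-create-info --where="id >= 250000 AND id < 500000" testdb bigtable > bigtable.001.sql
  # ...continue in 250,000-row steps, then reload the pieces in order
  cat bigtable.*.sql | mysql -u root -p testdb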
[20 Apr 2005 16:18] Jorge del Conde
Verified with 3 million records
[7 May 2005 11:00] Ingo Strüwing
Not enough information was provided for us to be able
to handle this bug. Please re-read the instructions at
http://bugs.mysql.com/how-to-report.php

If you can provide more information, feel free to add it
to this bug and change the status back to 'Open'.

Thank you for your interest in MySQL.

Additional info:

Jorge, I wish you had added the following:
1. A note that it cannot be reproduced on Linux (I had to try that myself).
2. The dump file (I had to create a table in 4.0 and fill it myself to produce a dump file; a rough alternative generator is sketched below).
3. A mention of which machine you verified it on, which package file you used, and ideally where you left the files for the developer's use. (I tried to repeat the case on powermacg4 and powermacg5, built a 5.0.6-debug server, and tested it. It just works, no crash at all; all 3 million rows were inserted.)
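For what it is worth, a comparable dump file can also be produced without a 4.0 server by emitting one INSERT per row directly. This is only an illustrative sketch with made-up names, sized roughly like the table the reporter described:

  # build a synthetic one-INSERT-per-row dump of about 1,100,000 rows
  {
    echo "CREATE TABLE bigtable (id INT NOT NULL PRIMARY KEY, payload CHAR(40));"
    awk 'BEGIN { for (i = 1; i <= 1100000; i++) printf "INSERT INTO bigtable VALUES (%d, \047row %d\047);\n", i, i }'
  } > bigtable.sql

  # attempt the reload that was reported to crash
  mysql -u root -p testdb < bigtable.sql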