Bug #70120 mysqldbcompare crashes
Submitted: 22 Aug 2013 12:57 Modified: 8 Sep 2016 13:44
Reporter: D Broeks Email Updates:
Status: No Feedback Impact on me:
None 
Category:MySQL Utilities Severity:S2 (Serious)
Version:1.3.4 OS:Linux (CentOS 6.4)
Assigned to: Chuck Bell CPU Architecture:Any
Tags: mysqldbcompare

[22 Aug 2013 12:57] D Broeks
Description:
After running for a while, the script crashes on a certain table. By that point it had already successfully analysed 35 other tables. We have 113 tables in total, so we cannot compare 78 of our tables.

Whenever I rerun mysqldbcompare, it keeps crashing on the same database table.

I experienced this on CentOS 6.4, but it may affect other OSes as well.

This is the error I get:

Traceback (most recent call last):
  File "/usr/bin/mysqldbcompare", line 238, in <module>
    db1, db2, options)
  File "/usr/lib/python2.6/site-packages/mysql/utilities/command/dbcompare.py", line 483, in database_compare
    reporter, options)
  File "/usr/lib/python2.6/site-packages/mysql/utilities/command/dbcompare.py", line 318, in _check_data_consistency
    obj1, obj2, options)
  File "/usr/lib/python2.6/site-packages/mysql/utilities/common/dbcompare.py", line 995, in check_consistency
    rows = _get_rows_span(table1, extra1)
  File "/usr/lib/python2.6/site-packages/mysql/utilities/common/dbcompare.py", line 824, in _get_rows_span
    rows.append(res2[0])
IndexError: list index out of range
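The failing line in the traceback indexes `res2[0]`, which suggests `res2` is an empty result set for this table. A hedged sketch (not the actual MySQL Utilities source) of a defensive guard, assuming an empty result simply means "no row found for this span":

```python
# Hypothetical helper mirroring the pattern at dbcompare.py line 824,
# where rows.append(res2[0]) raises IndexError on an empty result set.

def append_first_row(rows, result):
    """Append the first row of a query result, tolerating empty results."""
    if result:                  # only index when the result set is non-empty
        rows.append(result[0])
    return rows

# Plain lists stand in for result sets here:
rows = []
append_first_row(rows, [("hash1",)])  # non-empty result: first row appended
append_first_row(rows, [])            # empty result: skipped, no IndexError
```

Whether silently skipping the empty result is the right fix depends on why the span query returns no rows in the first place; the guard only avoids the crash.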

This is my version info:

MySQL Utilities mysqldbcompare version 1.3.4 (part of MySQL Workbench Distribution 5.2.47)

How to repeat:
This is the command I use:

mysqldbcompare --run-all-tests --span-key-size=8 --server1=<user>:<passwd>@192.168.0.1 --server2=<user>:<passwd>@192.168.0.2 <dbname>

Suggested fix:
As a workaround, a feature to compare the database table by table (as mysqldiff does), or to skip certain tables, would be handy. That way we could also test the tables that come after the crashing one.
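The per-table workaround suggested above could be sketched as a small shell loop around mysqldiff. This is only an illustration: `SERVER1`/`SERVER2` are placeholder `user:passwd@host` strings, the table names would come from `SHOW TABLES`, and it assumes mysqldiff's `db.table:db.table` argument form.

```shell
# Hypothetical per-table comparison loop. A failure (or crash) on one
# table is reported but does not stop the remaining tables.
compare_tables() {
  db="$1"; shift
  for t in "$@"; do
    if ! mysqldiff --server1="$SERVER1" --server2="$SERVER2" "$db.$t:$db.$t"
    then
      echo "FAILED: $t"
    fi
  done
}

# Example invocation (table list obtained separately via SHOW TABLES):
# compare_tables mydb users documents orders
```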
[22 Aug 2013 15:10] Chuck Bell
If possible, please provide the size and type of the primary key for the table that is being read at the time of the defect. If you cannot tell, please provide any information as to a general layout of the tables. Any information you can provide will help greatly in finding the cause of this defect.
[23 Aug 2013 6:52] D Broeks
CREATE TABLE IF NOT EXISTS `documents` (
  `hash` varchar(255) NOT NULL,
  `fileExt` varchar(10) NOT NULL,
  `dateAdded` datetime NOT NULL,
  `dateDeleted` datetime DEFAULT NULL,
  `nrDownloads` int(11) NOT NULL DEFAULT '0',
  PRIMARY KEY (`hash`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

The table is about 114 MiB in size. It is not the biggest; the script successfully processed larger tables before the crash.

The table has about 1.8 million rows. The tables the script successfully handled before the crash all have fewer rows.

The PK is a VARCHAR(255), but the data in this column is never longer than 20 characters.

The InnoDB table has no foreign keys to other tables, and no other InnoDB tables reference this table with a FK.
[28 Apr 2016 2:00] Philip Olson
Changing status to "Open", and assigning to Chuck.
[8 Aug 2016 13:22] Jeremy Kusnetz
I'm on 1.5.6 and getting this error.  Any updates?
[8 Aug 2016 13:44] Chuck Bell
This specific issue was fixed long ago. Indeed, the code shown in the traceback no longer exists in release-1.5.6. If you are getting the same traceback, you may have an older version of Utilities installed. However, I suspect you are getting the same or a similar error with a different traceback.

If so, please post that traceback so we can see what is going on. We would also like to see the schema for the table that is causing the crash, as well as any other pertinent data, to help us figure out what went wrong. You can post that on this bug report too.

Finally, a new GA release was announced recently. Please try release-1.6.4 to see if the issue has been fixed.
[9 Sep 2016 1:00] Bugs System
No feedback was provided for this bug for over a month, so it is
being suspended automatically. If you are able to provide the
information that was originally requested, please do so and change
the status of the bug back to "Open".