Bug #55807 Last_IO_Errno: 1236 and mysqlbinlog unable to read log with --start-position=N
Submitted: 6 Aug 2010 18:14 Modified: 17 Aug 2010 18:23
Reporter: Ben Vins
Status: Not a Bug
Category: MySQL Server: Replication
Severity: S1 (Critical)
Version: 4.1.22, 5.1.40
OS: Windows
Assigned to:
CPU Architecture: Any
Tags: Last_IO_Errno: 1236

[6 Aug 2010 18:14] Ben Vins
Description:
Here is a description of the setup. The DB engine is MyISAM.

Master (Windows server 2003 SE):
#mysql --version
mysql  Ver 14.7 Distrib 4.1.22, for Win32 (ia32)

Slave (Windows XP SP3):
#mysql --version
mysql  Ver 14.14 Distrib 5.1.40, for Win32 (ia32)

Here is the error I am getting:

mysql> show slave status\G
*************************** 1. row ***************************
               Slave_IO_State:
                  Master_Host: 10.xx.xxx.24
                  Master_User: bfReplication
                  Master_Port: 3306
                Connect_Retry: 200
              Master_Log_File: GS2STDB01-bin.000041
          Read_Master_Log_Pos: 79
               Relay_Log_File: myslave-relay-bin.000134
                Relay_Log_Pos: 153
        Relay_Master_Log_File: GS2STDB01-bin.000041
             Slave_IO_Running: No
            Slave_SQL_Running: No
              Replicate_Do_DB: bfresult
          Replicate_Ignore_DB:
           Replicate_Do_Table: bfresult.session_fact_hour,bfresult.bfmediaplexgoals_fact_hour,bfresult.bfpageloadtime_fact_hour,bfresult.campaigngoals_fact_hour,bfresult.browser_fact_hour,bfresult.formfields_fact_hour,bfresult.sitegrip_fact_hour,bfresult.formattempts_fact_hour,bfresult.formsequences_fact_hour,bfresult.pages2_fact_hour,bfresult.sessionmin_fact_5min,bfresult.referrer_fact_hour,bfresult.pages_fact_hour,bfresult.formvalues_fact_hour,bfresult.SEARCHENGINE_FACT_HOUR,bfresult.popularobjects_fact_hour
       Replicate_Ignore_Table:
      Replicate_Wild_Do_Table: bfresult.st_%
  Replicate_Wild_Ignore_Table:
                   Last_Errno: 0
                   Last_Error:
                 Skip_Counter: 0
          Exec_Master_Log_Pos: 79
              Relay_Log_Space: 314
              Until_Condition: None
               Until_Log_File:
                Until_Log_Pos: 0
           Master_SSL_Allowed: No
           Master_SSL_CA_File:
           Master_SSL_CA_Path:
              Master_SSL_Cert:
            Master_SSL_Cipher:
               Master_SSL_Key:
        Seconds_Behind_Master: NULL
Master_SSL_Verify_Server_Cert: No
                Last_IO_Errno: 1236
                Last_IO_Error: Got fatal error 1236 from master when reading data from binary log: 'bogus data in log event'
               Last_SQL_Errno: 0
               Last_SQL_Error:

As a first step toward solving this problem, I am trying to read the binary log as follows:
#mysqlbinlog  --read-from-remote-server -uroot  -p GS2STDB01-bin.000041 -h 10.xx.xxx.24 -P 3306

/*!40019 SET @@session.max_insert_delayed_threads=0*/;
/*!50003 SET @OLD_COMPLETION_TYPE=@@COMPLETION_TYPE,COMPLETION_TYPE=0*/;
DELIMITER /*!*/;
# at 4
#100804  4:09:08 server id 1  end_log_pos 79    Start: binlog v 3, server v 4.1.22-community-nt-log created 100804  4:09:08 at startup
ROLLBACK/*!*/;
BINLOG '
1NlYTAEBAAAASwAAAAQAAAAAAAMANC4xLjIyLWNvbW11bml0eS1udC1sb2cAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAADU2VhM
'/*!*/;
ERROR: Got error reading packet from server: bogus data in log event
DELIMITER ;
# End of log file
ROLLBACK /* added by mysqlbinlog */;
/*!50003 SET COMPLETION_TYPE=@OLD_COMPLETION_TYPE*/;

I was hoping to skip the bad block as described here:
http://www.softwareprojects.com/resources/programming/t-how-to-recover-from-mysql-replicat...

Trying from another position just fails:

#mysqlbinlog  --read-from-remote-server -uroot  -pb3Tf4!re GS2STDB01-bin.000041 -h 10.xx.xxx.24 -P 3306 --start-position=90
/*!40019 SET @@session.max_insert_delayed_threads=0*/;
/*!50003 SET @OLD_COMPLETION_TYPE=@@COMPLETION_TYPE,COMPLETION_TYPE=0*/;
DELIMITER /*!*/;
ERROR: Got error reading packet from server: log event entry exceeded max_allowed_packet; Increase max_allowed_packet on master
DELIMITER ;
# End of log file
ROLLBACK /* added by mysqlbinlog */;
/*!50003 SET COMPLETION_TYPE=@OLD_COMPLETION_TYPE*/;

Actually, even trying to read from a predefined position on an uncorrupted binlog fails:

#mysqlbinlog  --read-from-remote-server -uroot  -pb3Tf4!re GS2STDB01-bin.000040 -h 10.xx.xxx.24 -P 3306 --start-position=90
/*!40019 SET @@session.max_insert_delayed_threads=0*/;
/*!50003 SET @OLD_COMPLETION_TYPE=@@COMPLETION_TYPE,COMPLETION_TYPE=0*/;
DELIMITER /*!*/;
ERROR: Got error reading packet from server: log event entry exceeded max_allowed_packet; Increase max_allowed_packet on master
DELIMITER ;
# End of log file
ROLLBACK /* added by mysqlbinlog */;
/*!50003 SET COMPLETION_TYPE=@OLD_COMPLETION_TYPE*/;

Note that the following command will run successfully:
#mysqlbinlog  --read-from-remote-server -uroot  -pb3Tf4!re GS2STDB01-bin.000040 -h 10.xx.xxx.24 -P 3306

In both the master's and the slave's .ini file, I added the following below the line "[mysqld]":
max_allowed_packet=32M
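One way to confirm the new value is actually in effect (an assumption here: the setting only takes effect after restarting each server, so a stale value means the edited [mysqld] section is not the one being read):

```sql
-- Run on both master and slave after restarting each server.
SHOW VARIABLES LIKE 'max_allowed_packet';
```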

Any advice? I know that my master is very old, but it is part of a proprietary software bundle whose vendor does not support the MySQL install, so I am on my own.

I have restarted the slave a few times already by copying the tables manually from the master, but I would like to find a solution once and for all.

Thanks for any advice/help.

How to repeat:
I can provide the binlog if requested, but it is 1 GB.
[9 Aug 2010 18:17] Sveta Smirnova
Thank you for the report.

I cannot repeat the described behavior with a test binary log. Please upload the problem binary log as described in the "Files" section. Can you repeat the same issue with smaller binary log files? If so, please send us a smaller binary log.
[12 Aug 2010 9:32] Ben Vins
Ok, I just got the problem once more. My binlog is only 57 MB, but it is still too big to attach to this bug report. Is there any other way to get this data to you?
[12 Aug 2010 16:59] Sveta Smirnova
Thank you for the feedback.

As described in "Files" section:

If the data you need to attach is more than 500KB, you should create a compressed archive of the data and a README file that describes the data with a filename that includes the bug number (example: bug-data-55807.zip), and use FTP to upload the archive to ftp://ftp.mysql.com/pub/mysql/upload/. Once you have uploaded the file, add a comment to this bug to notify us about it. Note: This directory is unlistable, which means that once you have uploaded your file, you will not be able to see it.
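The packaging step described above could look like this (a sketch only: the README wording is made up, and tar/gzip is used in place of zip; any compressed archive format with the bug number in the filename should do):

```shell
# Describe the data in a README, as the "Files" section asks.
echo "Corrupted binlog for bug #55807; mysqlbinlog fails with 'bogus data in log event'." > README

# Create the compressed archive named after the bug number.
# (Add the actual binlog file alongside README in a real upload.)
tar -czf bug-data-55807.tar.gz README

# List the archive contents to verify it before uploading via FTP.
tar -tzf bug-data-55807.tar.gz
```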
[13 Aug 2010 10:20] Ben Vins
Hi Sveta,

Before I send you the binlog file, could you confirm I am sending the correct one?

Basically, I am confused about what is happening. Something seems to have made the master stop writing to binlog 000005 before reaching the 1 GB limit and create binlog 000006. I would expect replication to break on the slave while reading binlog 000005, not at the first entry (position 79) of binlog 000006.
Binlog 000006 does not seem that badly corrupted, as the master carried on writing to it up to the 1 GB limit, then created binlog 000007 and so on, and is still running.

Do you have any idea what is happening? Do you want GS2STDB01-bin.000005 or GS2STDB01-bin.000006?

Here is what I am seeing:
GS2STDB01-bin.000001  -> 1,048,585 KB
GS2STDB01-bin.000002  -> 1,048,585 KB
GS2STDB01-bin.000003  -> 1,048,585 KB
GS2STDB01-bin.000004  -> 1,048,585 KB
GS2STDB01-bin.000005  -> 357,786 KB
GS2STDB01-bin.000006  -> 1,048,585 KB
GS2STDB01-bin.000007  -> 1,048,585 KB
etc...

I picked up the GS2STDB01-bin.000006 file when it was still only 56.4 MB, but as the master is live, it has carried on growing.

The slave shows 
mysql> show slave status\G

*************************** 1. row ***************************
               Slave_IO_State:
                  Master_Host: 10.xx.xxx.24
                  Master_User: bfReplication
                  Master_Port: 3306
                Connect_Retry: 200
              Master_Log_File: GS2STDB01-bin.000006
          Read_Master_Log_Pos: 79
               Relay_Log_File: slave-relay-bin.000017
                Relay_Log_Pos: 228
        Relay_Master_Log_File: GS2STDB01-bin.000006
             Slave_IO_Running: No
            Slave_SQL_Running: Yes
              Replicate_Do_DB: bfresult
          Replicate_Ignore_DB:
           Replicate_Do_Table: bfresult.session_fact_hour,bfresult.bfmediaplexgoals_fact_hour,bfresult.bfpageloadtime_fact_hour,bfresult.campaigngoals_fact_hour,bfresult.browser_fact_hour,bfresult.formfields_fact_hour,bfresult.sitegrip_fact_hour,bfresult.formattempts_fact_hour,bfresult.formsequences_fact_hour,bfresult.pages2_fact_hour,bfresult.sessionmin_fact_5min,bfresult.referrer_fact_hour,bfresult.pages_fact_hour,bfresult.formvalues_fact_hour,bfresult.SEARCHENGINE_FACT_HOUR,bfresult.popularobjects_fact_hour
       Replicate_Ignore_Table:
      Replicate_Wild_Do_Table: bfresult.st_%
  Replicate_Wild_Ignore_Table:
                   Last_Errno: 0
                   Last_Error:
                 Skip_Counter: 0
          Exec_Master_Log_Pos: 79
              Relay_Log_Space: 436
              Until_Condition: None
               Until_Log_File:
                Until_Log_Pos: 0
           Master_SSL_Allowed: No
           Master_SSL_CA_File:
           Master_SSL_CA_Path:
              Master_SSL_Cert:
            Master_SSL_Cipher:
               Master_SSL_Key:
        Seconds_Behind_Master: NULL
Master_SSL_Verify_Server_Cert: No
                Last_IO_Errno: 1236
                Last_IO_Error: Got fatal error 1236 from master when reading data from binary log: 'bogus data in log event'
               Last_SQL_Errno: 0
               Last_SQL_Error:
1 row in set (0.30 sec)
[13 Aug 2010 19:06] Sveta Smirnova
Thank you for the feedback.

I want any binary log that mysqlbinlog has problems reading. It is better if it is a closed (not currently in use by the master) and smaller binary log.
[17 Aug 2010 11:34] Ben Vins
Hi Sveta,

I have now uploaded a corrupted binlog to your FTP site. Please let me know if you need anything else.

regards,

Ben
[17 Aug 2010 18:23] Sveta Smirnova
Thank you for the feedback.

The binary log is really corrupted and cannot be read even by mysqlbinlog from a 4.1 installation.

Therefore this is not a bug. max_allowed_packet can be set up to 1G. As a workaround, you can extract the queries using the UNIX `strings` command, then edit that file manually so the data can be loaded into the slave.
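A minimal sketch of that `strings` workaround on a toy file (the real input would be the corrupted GS2STDB01-bin.000006; the filename and statement below are made up for illustration):

```shell
# Simulate a corrupted binlog: binary garbage around a readable statement.
printf 'garbage\000\001INSERT INTO st_demo VALUES (1);\000\377more' > corrupt-bin.000006

# Pull out the printable text; the result still needs manual editing
# before any of it can be replayed on the slave.
strings corrupt-bin.000006 > recovered.sql
grep 'INSERT' recovered.sql
```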