Description:
Hello,
Recently we found a case where a "row size too large" error is reported while executing an "insert into select" statement, which is unexpected:
1. the table consists of 43 char(50) fields (charset utf8mb4), and every field holds the same value "AAAAAAAAAAAAAAAAAAAA" (20 characters), so every record has the same actual row size: 43 * 50 = 2150 bytes;
2. after inserting 4 records into the table, the first 6 "insert into select" statements succeed, but the following "insert into select" statements fail. Since every record has the same length, this behavior is inconsistent;
3. after increasing temptable_max_ram and tmp_table_size to larger values, "insert into select" statements succeed again, but with a larger amount of data they eventually fail once more (see the sketch after this list).
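The numbers suggest the failure comes from the internal temporary table spilling to disk, not from the data itself. Below is a minimal sketch of the arithmetic, assuming the on-disk conversion sizes each column at its charset worst case (utf8mb4 = up to 4 bytes per character), while InnoDB's maximum row size on the default 16 KB page is about 8126 bytes (the limit quoted in the error message):

-- actual bytes stored per row: 43 char(50) columns holding ASCII data
select 43 * 50 as actual_row_bytes;         -- 2150, well under 8126
-- worst-case width assumed when the tmp table is converted to on-disk InnoDB
select 43 * 50 * 4 as worst_case_row_bytes; -- 8600, over the 8126 limit

So each "insert into select" succeeds while the materialized SELECT result still fits in TempTable memory, and fails as soon as it has to spill, even though no actual row is anywhere near 8126 bytes.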
How to repeat:
[yxxdb@localhost ~]$ mysql -uroot -p'db1x@NJ+1' -S bin/mysql1.sock
mysql: [Warning] Using a password on the command line interface can be insecure.
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 8
Server version: 8.0.36 Source distribution
mysql> create database row_size;
Query OK, 1 row affected (0.00 sec)
mysql> use row_size
Database changed
mysql>
mysql> set global internal_tmp_mem_storage_engine = TempTable;
Query OK, 0 rows affected (0.00 sec)
mysql> set global temptable_max_ram = 2097152;
Query OK, 0 rows affected (0.00 sec)
mysql> set global tmp_table_size = 2097152;
Query OK, 0 rows affected (0.00 sec)
mysql> set global temptable_use_mmap = OFF;
Query OK, 0 rows affected, 1 warning (0.00 sec)
mysql> create table t1 (c1 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c2 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c3 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c4 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c5 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c6 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c7 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c8 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c9 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c10 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c11 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c12 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c13 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c14 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c15 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c16 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c17 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c18 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c19 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c20 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c21 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c22 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c23 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c24 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c25 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c26 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c27 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c28 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c29 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c30 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c31 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c32 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c33 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c34 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c35 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c36 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c37 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c38 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c39 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c40 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c41 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c42 char(50) default 'AAAAAAAAAAAAAAAAAAAA',c43 char(50) default 'AAAAAAAAAAAAAAAAAAAA') charset utf8mb4;
Query OK, 0 rows affected (0.01 sec)
mysql> insert into t1 () values ();
Query OK, 1 row affected (0.01 sec)
mysql> insert into t1 () values ();
Query OK, 1 row affected (0.00 sec)
mysql> insert into t1 () values ();
Query OK, 1 row affected (0.00 sec)
mysql> insert into t1 () values ();
Query OK, 1 row affected (0.00 sec)
mysql> insert into t1 select * from t1;
Query OK, 4 rows affected (0.00 sec)
Records: 4 Duplicates: 0 Warnings: 0
mysql> insert into t1 select * from t1;
Query OK, 8 rows affected (0.00 sec)
Records: 8 Duplicates: 0 Warnings: 0
mysql> insert into t1 select * from t1;
Query OK, 16 rows affected (0.00 sec)
Records: 16 Duplicates: 0 Warnings: 0
mysql> insert into t1 select * from t1;
Query OK, 32 rows affected (0.00 sec)
Records: 32 Duplicates: 0 Warnings: 0
mysql> insert into t1 select * from t1;
Query OK, 64 rows affected (0.01 sec)
Records: 64 Duplicates: 0 Warnings: 0
mysql> insert into t1 select * from t1;
Query OK, 128 rows affected (0.01 sec)
Records: 128 Duplicates: 0 Warnings: 0
mysql> insert into t1 select * from t1;
ERROR 1118 (42000): Row size too large (> 8126). Changing some columns to TEXT or BLOB may help. In current row format, BLOB prefix of 0 bytes is stored inline.
mysql> insert into t1 select * from t1;
ERROR 1118 (42000): Row size too large (> 8126). Changing some columns to TEXT or BLOB may help. In current row format, BLOB prefix of 0 bytes is stored inline.
mysql> set global tmp_table_size = 10485760;
Query OK, 0 rows affected (0.00 sec)
mysql> set global temptable_max_ram = 10485760;
Query OK, 0 rows affected (0.00 sec)
mysql> insert into t1 select * from t1;
Query OK, 256 rows affected (0.01 sec)
Records: 256 Duplicates: 0 Warnings: 0
mysql> insert into t1 select * from t1;
Query OK, 512 rows affected (0.01 sec)
Records: 512 Duplicates: 0 Warnings: 0
mysql> insert into t1 select * from t1;
ERROR 1118 (42000): Row size too large (> 8126). Changing some columns to TEXT or BLOB may help. In current row format, BLOB prefix of 0 bytes is stored inline.
mysql> insert into t1 select * from t1;
ERROR 1118 (42000): Row size too large (> 8126). Changing some columns to TEXT or BLOB may help. In current row format, BLOB prefix of 0 bytes is stored inline.
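One way to confirm that the failures coincide with the spill is to watch the TempTable allocator counters in performance_schema between statements (a diagnostic sketch; instrument and column names as documented for stock 8.0):

select event_name, current_number_of_bytes_used
from performance_schema.memory_summary_global_by_event_name
where event_name like 'memory/temptable/%';

If memory/temptable/physical_ram is close to temptable_max_ram right before a failing statement, the next allocation forces the conversion to an on-disk InnoDB temporary table (temptable_use_mmap is OFF here), which is where the row-size check fires.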
Suggested fix:
Inserting the same data directly into an InnoDB table succeeds, but converting the in-memory temporary table (TempTable SE) to an on-disk InnoDB table fails for that data; success is actually expected in both cases.
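As a minimal sketch of the asymmetry (t2 is a hypothetical table name, and the outcomes are expected behavior, not captured output): the same rows fit comfortably in a regular InnoDB table, so only the tmp-table conversion path rejects them:

create table t2 like t1;          -- plain InnoDB table with the same definition
insert into t2 select * from t1;  -- expected to succeed: actual rows are only 2150 bytes

Because source and target differ here, this copy typically does not have to materialize the SELECT result into an internal temporary table, so it also serves as a workaround until the conversion itself is fixed.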