Bug #104935 JSON_TABLE function reports duplicate column name
Submitted: 14 Sep 2021 8:12  Modified: 16 Sep 2021 20:21
Reporter: Yin Peng (OCA)
Status: Duplicate
Category: MySQL Server: JSON  Severity: S3 (Non-critical)
Version: 8.0.25, 8.0.26  OS: Any
CPU Architecture: Any

[14 Sep 2021 8:12] Yin Peng
Description:
The JSON_TABLE function compares only the first 64 bytes of each column name instead of the first 64 characters, so when a column name contains multi-byte characters, the function may spuriously report a duplicate-column error for names that are in fact distinct.

How to repeat:
select * from json_table('[ {"c1": 199,"c2":200} ]','$[*]' COLUMNS( `测试列aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa` INT PATH '$.c1' ERROR ON ERROR,`测试列aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaabb` int path '$.c2' error on error)) as jt;

Suggested fix:
diff --git a/sql/table_function.cc b/sql/table_function.cc
index 3af81dba53b..7106c72b69b 100644
--- a/sql/table_function.cc
+++ b/sql/table_function.cc
@@ -302,7 +302,7 @@ bool Table_function_json::init() {
       List_iterator<Json_table_column> li2(m_vt_list);
       // Compare 'first' with all columns prior to it
       while ((col = li2++) && col != first) {
-        if (!strncmp(first->field_name, col->field_name, NAME_CHAR_LEN)) {
+        if (!strncmp(first->field_name, col->field_name, NAME_LEN)) {
           my_error(ER_DUP_FIELDNAME, MYF(0), first->field_name);
           return true;
         }
[14 Sep 2021 8:15] Yin Peng
Use NAME_LEN instead of NAME_CHAR_LEN in function Table_function_json::init.

(*) I confirm the code being submitted is offered under the terms of the OCA, and that I am authorized to contribute it.

Contribution: fix.txt (text/plain), 826 bytes.

[14 Sep 2021 8:25] Yin Peng
sorry, use NAME_LEN instead of NAME_CHAR_LEN in function Table_function_json::init
[14 Sep 2021 9:11] MySQL Verification Team
Hello Yin Peng,

Thank you for the report and contribution.

regards,
Umesh
[16 Sep 2021 20:21] Jon Stephens
Duplicate of BUG#102824, which is fixed in MySQL 8.0.27.
[25 Jan 2022 17:13] Frederic Descamps
Thank you for your contribution; however, this was already fixed differently internally.

Cheers,