| Bug #108365 | JSON_TABLE fails to parse when used in LATERAL join | ||
|---|---|---|---|
| Submitted: | 1 Sep 2022 19:15 | Modified: | 2 Sep 2022 3:55 |
| Reporter: | Justin Naifeh | Email Updates: | |
| Status: | Verified | Impact on me: | |
| Category: | MySQL Server: JSON | Severity: | S3 (Non-critical) |
| Version: | 8.0.30 | OS: | Any |
| Assigned to: | | CPU Architecture: | Any |
| Tags: | json_table | ||
[2 Sep 2022 3:55]
MySQL Verification Team
Hello Justin,

Thank you for the report and test case. Verified as described.

regards,
Umesh
[4 Sep 2022 8:20]
huahua xu
Hi Justin,
You can work around this problem by changing an optimizer switch to disable derived-table merging:
mysql> set optimizer_switch="derived_merge=off";
Query OK, 0 rows affected (0.00 sec)
mysql> select l.val from (
-> select json_arr from stuff
-> ) s,
-> lateral (
-> select
-> val
-> from json_table(
-> s.json_arr,
-> '$[*]'
-> columns (
-> val char(24) path '$'
-> )
-> ) i
-> ) l;
+------+
| val |
+------+
| a |
| b |
| c |
+------+
3 rows in set (0.00 sec)
In the initialization phase of `Table_function_json`, the column reference (JSON_TABLE's data source expression) cannot be resolved, so an error is raised. This happens because the optimizer merges derived tables or views into the outer query block.
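If changing `optimizer_switch` for the whole session is undesirable, the same effect might be achievable per query with a `NO_MERGE` optimizer hint (available since MySQL 8.0.22; a sketch, not verified against this bug):

```sql
-- Sketch: suppress derived-table merging for just this query via an
-- optimizer hint, instead of the session-wide derived_merge=off switch.
select /*+ NO_MERGE(s) */ l.val from (
  select json_arr from stuff
) s,
lateral (
  select val
  from json_table(
    s.json_arr,
    '$[*]'
    columns (val char(24) path '$')
  ) i
) l;
```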
[2 Oct 2023 19:05]
Rayne Gardner
Multiple chained CTEs also cause this problem. For example, the following works:
WITH t1 (col1, col2) AS (
VALUES ROW(1,'["value 1","value 2"]')
), t2 (col1, col2) AS (
SELECT * FROM t1
LIMIT 100
)
SELECT t2.col1, jt.col2_new FROM t2
CROSS JOIN JSON_TABLE (
t2.col2,
'$[*]' COLUMNS (col2_new VARCHAR(255) PATH '$')
) AS jt;
But if you comment out the "LIMIT 100" clause, you will get "Error Code: 1210. Incorrect arguments to JSON_TABLE".
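If relying on the "LIMIT 100" clause as a workaround feels fragile, an explicit `NO_MERGE` hint on the referencing query block may work as well, since LIMIT appears to help only as a side effect of blocking the merge (a sketch, not verified against this bug; `NO_MERGE` requires MySQL 8.0.22+):

```sql
-- Sketch: ask the optimizer not to merge t2 into the outer query block,
-- making the workaround explicit rather than depending on LIMIT.
WITH t1 (col1, col2) AS (
  VALUES ROW(1,'["value 1","value 2"]')
), t2 (col1, col2) AS (
  SELECT * FROM t1
)
SELECT /*+ NO_MERGE(t2) */ t2.col1, jt.col2_new FROM t2
CROSS JOIN JSON_TABLE (
  t2.col2,
  '$[*]' COLUMNS (col2_new VARCHAR(255) PATH '$')
) AS jt;
```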

Description:
A specific query with JSON_TABLE and a LATERAL join fails to parse and throws the error "Error Code: 1210. Incorrect arguments to JSON_TABLE." Meanwhile, adding a LIMIT clause makes it work again, even though it is the same data in both cases.

How to repeat:
The following query should output three rows with values (a,b,c):

create table stuff (json_arr json);
insert into stuff values ('["a","b","c"]');

select l.val from (
  select json_arr from stuff
  -- limit 1000000
) s,
lateral (
  select val
  from json_table(
    s.json_arr,
    '$[*]'
    columns (
      val char(24) path '$'
    )
  ) i
) l;

The query as-is throws "Error Code: 1210. Incorrect arguments to JSON_TABLE". But if you uncomment "limit 1000000" then it will work and output the three rows. This bug also occurs if the table has no data. It is interesting to note that the bug does not occur if you omit the table entirely, like so:

select l.val from (
  select cast('["a","b","c"]' as json) json_arr
) s,
lateral (
  select val
  from json_table(
    s.json_arr,
    '$[*]'
    columns (
      val char(24) path '$'
    )
  ) i
) l;

Suggested fix:
The query should be parsed regardless of how data arrives in the derived table.