Description:
Using "mysqlx" (aka X DevAPI) with Connector/Python is a lot slower than the traditional mysql.connector interface.
This is what my testing shows.
mysqlx: fetched 1000 rows in elapsed=0.18699979782104492
mysql.connector: fetched 1000 rows in elapsed=0.06336450576782227
mysqlx: fetched 5000 rows in elapsed=0.8569087982177734
mysql.connector: fetched 5000 rows in elapsed=0.2925424575805664
mysqlx: fetched 10000 rows in elapsed=1.7030038833618164
mysql.connector: fetched 10000 rows in elapsed=0.5785782337188721
mysqlx: fetched 100000 rows in elapsed=16.321004152297974
mysql.connector: fetched 100000 rows in elapsed=5.475654363632202
How to repeat:
import mysqlx
import mysql.connector
import time
params = {'host': '127.0.0.1', 'user': 'myuser', 'password': 'xxxxxxxx'}
for rows in [1000, 5000, 10000, 100000]:
    # X DevAPI (X Protocol, port 33060 by default)
    session = mysqlx.get_session(params)
    schema = session.get_schema('test')
    big_table = schema.get_table('big_table')
    start = time.time()
    big_table.select().limit(rows).execute().fetch_all()
    elapsed = time.time() - start
    print(f"mysqlx: fetched {rows:6d} rows in {elapsed=}")
    session.close()

    # classic protocol (port 3306 by default)
    con = mysql.connector.connect(**params)
    cur = con.cursor()
    start = time.time()
    cur.execute(f"SELECT * FROM test.big_table LIMIT {rows}")
    [x for x in cur]  # materialize all rows
    elapsed = time.time() - start
    print(f"mysql.connector: fetched {rows:6d} rows in {elapsed=}")
    cur.close()
    con.close()
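To help narrow down where the overhead sits, a variant along these lines might be useful. It is an untested sketch that assumes the same test.big_table table and params dict as above; session.sql() runs raw SQL over the X Protocol (bypassing the CRUD statement builder), and use_pure=True forces the pure-Python classic implementation so the C extension is out of the picture.

import time

import mysql.connector
import mysqlx

params = {'host': '127.0.0.1', 'user': 'myuser', 'password': 'xxxxxxxx'}
ROWS = 10000

# X DevAPI, but raw SQL instead of the table.select() builder
session = mysqlx.get_session(params)
start = time.time()
session.sql(f"SELECT * FROM test.big_table LIMIT {ROWS}").execute().fetch_all()
print(f"mysqlx (session.sql): {time.time() - start:.3f}s")
session.close()

# classic protocol, pure-Python implementation (no C extension)
con = mysql.connector.connect(use_pure=True, **params)
cur = con.cursor()
start = time.time()
cur.execute(f"SELECT * FROM test.big_table LIMIT {ROWS}")
cur.fetchall()
print(f"mysql.connector (use_pure=True): {time.time() - start:.3f}s")
cur.close()
con.close()

If session.sql() is about as slow as table.select(), the time is likely going into decoding X Protocol result sets rather than into the fluent API; if use_pure=True closes most of the gap, the original comparison is largely measuring the classic connector's C extension.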