Bug #108497 mysqlshell: util.exportTable does not include bucket information in output
Submitted: 15 Sep 2022 15:00 Modified: 15 Nov 2022 13:15
Reporter: Marc Reilly Email Updates:
Status: Closed Impact on me: None
Category: Shell Dump & Load Severity: S3 (Non-critical)
Version: 8.0.30 OS: Any
Assigned to: CPU Architecture: Any
Tags: mysqlsh, s3

[15 Sep 2022 15:00] Marc Reilly
Description:
When exporting a table to S3 using util.exportTable, the output includes a very useful suggested util.importTable command that can be used to import the table on the destination.

However, when s3BucketName is used, the suggested command is incorrect: it does not include the bucket information, which has to be added manually.

How to repeat:
1. Export a table to S3:
 MySQL  mydb.com:3306 ssl  JS > util.exportTable("sysbench.sbtest1", "dbtest1.tsv", {"s3BucketName": "my-s3-bucket"})
Initializing - done
Gathering information - done
Running data dump using 1 thread.
NOTE: Progress information uses estimated values and may not be accurate.
Starting data dump
101% (5.00M rows / ~4.94M rows), 438.45K rows/s, 86.94 MB/s
Dump duration: 00:00:10s
Total duration: 00:00:10s
Data size: 977.78 MB
Rows written: 5000000
Bytes written: 977.78 MB
Average throughput: 90.30 MB/s

The dump can be loaded using:
util.importTable("dbtest1.tsv", {
    "characterSet": "utf8mb4",
    "schema": "sysbench",
    "table": "sbtest1"
})

2. In the above output, you can see that the suggested "The dump can be loaded using..." util.importTable command does not contain the bucket options used for the export. Running that command returns the following error:
 MySQL  mydb.com:3306 ssl  JS > util.importTable("dbtest1.tsv", {
                             ->     "characterSet": "utf8mb4",
                             ->     "schema": "sysbench",
                             ->     "table": "sbtest1"
                             -> })
Util.importTable: Cannot open file '/home/ec2-user/dbtest1.tsv': No such file or directory (RuntimeError)

3. After manually adding the bucket information, the import completes successfully; it would be nice if this were added automatically:
 MySQL  mydb.com:3306 ssl  JS > util.importTable("dbtest1.tsv", {
                             ->     "characterSet": "utf8mb4",
                             ->     "schema": "sysbench",
                             ->     "table": "sbtest1",
                             ->     "s3BucketName": "my-s3-bucket"
                             -> })
Importing from file 'dbtest1.tsv' to table `sysbench`.`sbtest1` in MySQL Server at mydb.com using 8 threads
[Worker006] dbtest1.tsv: Records: 257275  Deleted: 0  Skipped: 0  Warnings: 0
[Worker002] dbtest1.tsv: Records: 256703  Deleted: 0  Skipped: 0  Warnings: 0
[Worker003] dbtest1.tsv: Records: 256703  Deleted: 0  Skipped: 0  Warnings: 0
[Worker007] dbtest1.tsv: Records: 255393  Deleted: 0  Skipped: 0  Warnings: 0
[Worker000] dbtest1.tsv: Records: 256561  Deleted: 0  Skipped: 0  Warnings: 0
[Worker001] dbtest1.tsv: Records: 255391  Deleted: 0  Skipped: 0  Warnings: 0
[Worker004] dbtest1.tsv: Records: 255391  Deleted: 0  Skipped: 0  Warnings: 0
[Worker005] dbtest1.tsv: Records: 255394  Deleted: 0  Skipped: 0  Warnings: 0
[Worker006] dbtest1.tsv: Records: 255393  Deleted: 0  Skipped: 0  Warnings: 0
[Worker002] dbtest1.tsv: Records: 255390  Deleted: 0  Skipped: 0  Warnings: 0
[Worker003] dbtest1.tsv: Records: 255394  Deleted: 0  Skipped: 0  Warnings: 0
[Worker004] dbtest1.tsv: Records: 255390  Deleted: 0  Skipped: 0  Warnings: 0
[Worker001] dbtest1.tsv: Records: 255391  Deleted: 0  Skipped: 0  Warnings: 0
[Worker000] dbtest1.tsv: Records: 255393  Deleted: 0  Skipped: 0  Warnings: 0
[Worker007] dbtest1.tsv: Records: 255392  Deleted: 0  Skipped: 0  Warnings: 0
[Worker005] dbtest1.tsv: Records: 255392  Deleted: 0  Skipped: 0  Warnings: 0
[Worker006] dbtest1.tsv: Records: 255392  Deleted: 0  Skipped: 0  Warnings: 0
[Worker004] dbtest1.tsv: Records: 141877  Deleted: 0  Skipped: 0  Warnings: 0
[Worker002] dbtest1.tsv: Records: 255390  Deleted: 0  Skipped: 0  Warnings: 0
[Worker003] dbtest1.tsv: Records: 255395  Deleted: 0  Skipped: 0  Warnings: 0
100% (977.78 MB / 977.78 MB), 21.50 MB/s
File 'dbtest1.tsv' (977.78 MB) was imported in 29.9523 sec at 32.64 MB/s
Total rows affected in sysbench.sbtest1: Records: 5000000  Deleted: 0  Skipped: 0  Warnings: 0

Suggested fix:
For ease of use, could the bucket information provided in the util.exportTable command be included in the suggested import command?
[30 Sep 2022 10:01] MySQL Verification Team
Thank you for the report and feedback.
[15 Nov 2022 13:15] Edward Gilmore
Posted by developer:
 
Added the following note to the MySQL Shell 8.0.32 release notes:

When using util.exportTable() to export to an S3 bucket, the AWS-specific information, such as the S3 bucket
name, was not written to the suggested util.importTable() call.