Description:
The SHOW CREATE LIBRARY command produces an incorrect hexadecimal representation of binary libraries on x86_64 architectures. On x86_64 platforms (where char is signed by default), the command inserts erroneous "FFFFFF" patterns before each byte with the high bit set, resulting in an invalid and bloated hex representation.
For example, on x86_64:
AS 0x3872FFFFFFD3FFFFFFBBFFFFFFA66A27FFFFFFA9FFFFFFA34F3834FFFFFFF3FFFFFF82FFFFFFBC1A7360FFFFFFD3FFFFFFBA0C5A7761FFFFFFE5FFFFFFE67A7B
While on AArch64:
AS 0x3872D3BBA66A27A9A34F3834F382BC1A7360D3BA0C5A7761E5E67A7B
The AArch64 representation is the correct one: each byte is rendered as exactly two hex digits, the standard encoding for binary data. On x86_64, where plain char is signed by default, bytes with the high bit set are negative; integer promotion sign-extends them, producing the spurious "FFFFFF" patterns. On AArch64, where char is unsigned by default, no sign extension occurs and the patterns are absent.
This inconsistency causes test failures across different architectures and produces misleading, non-standard hexadecimal representations.
How to repeat:
Run the MySQL Test Run (MTR) test main.sp-library-binary on both x86_64 and AArch64 architectures:
```
[ 50%] main.sp-library-binary [ fail ]
Test ended at 2025-09-02 20:34:16
CURRENT_TEST: main.sp-library-binary
--- /quick-mysql-build/mysql-server/mysql-test/r/sp-library-binary.result 2025-09-02 23:18:41.432449980 +0300
+++ /quick-mysql-build/build/mysql-test/var/log/sp-library-binary.reject 2025-09-02 23:34:14.844343433 +0300
@@ -44,7 +44,7 @@
binary_library_binary ONLY_FULL_GROUP_BY,STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_ENGINE_SUBSTITUTION CREATE LIBRARY `binary_library_binary`
COMMENT 'A binary library stored in binary encoding'
LANGUAGE WASM
-AS 0x3872FFFFFFD3FFFFFFBBFFFFFFA66A27FFFFFFA9FFFFFFA34F3834FFFFFFF3FFFFFF82FFFFFFBC1A7360FFFFFFD3FFFFFFBA0C5A7761FFFFFFE5FFFFFFE67A7B
+AS 0x3872D3BBA66A27A9A34F3834F382BC1A7360D3BA0C5A7761E5E67A7B
```
The test fails on AArch64 because it expects the incorrect x86_64 representation with "FFFFFF" patterns, but AArch64 correctly produces a standard hex representation.
Suggested fix:
The issue occurs in the binary_to_hex function in sp.cc. The recommended solution is to prevent sign extension on all platforms by using a bit mask to isolate only the relevant 8 bits of each byte:
This approach masks off the higher bits with & 0xFF, preventing sign extension regardless of platform. It would produce the more compact and correct representation (matching current AArch64 behavior) on all architectures.
This solution produces the standard hexadecimal representation where each byte is represented by exactly 2 hex digits, which is the universally accepted format for binary data. It would require updating the existing test result to match the correct behavior.
An alternative solution would be to cast each byte to unsigned char first:
This fix first converts each byte to unsigned char, then to unsigned int, ensuring no sign extension occurs on any platform and producing output consistent with the correct AArch64 behavior. This too would require updating the existing test result.