Bug #91242 NPE from clusterJ when "get all" on a table containing TEXT or BLOB field
Submitted: 13 Jun 16:09 Modified: 18 Jun 10:56
Reporter: Andrew Tikhonov Email Updates:
Status: Verified Impact on me:
Category:MySQL Cluster: Cluster/J Severity:S2 (Serious)
Version:7.4.13 OS:Any
Assigned to: CPU Architecture:Any
Tags: clusterj BLOB TEXT NPE

[13 Jun 16:09] Andrew Tikhonov
ClusterJ throws a NullPointerException when a "get all records" operation is performed on a table that contains BLOB or TEXT fields.

The exception is thrown because, at the point where blob data is ready to be fetched into the internal arrays, the blob objects have not yet been created. Creating the blob objects is a separate method call, and it appears this call was simply forgotten.

Caused by: java.lang.NullPointerException
	at com.mysql.clusterj.tie.BlobImpl.getLength(BlobImpl.java:57)
	at com.mysql.clusterj.tie.NdbRecordBlobImpl.readData(NdbRecordBlobImpl.java:110)
	at com.mysql.clusterj.tie.NdbRecordOperationImpl.loadBlobValues(NdbRecordOperationImpl.java:935)
	at com.mysql.clusterj.tie.NdbRecordScanResultDataImpl.next(NdbRecordScanResultDataImpl.java:139)
	at com.mysql.clusterj.core.query.QueryDomainTypeImpl.getResultList(QueryDomainTypeImpl.java:183)
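
The failure mode can be illustrated with a minimal standalone sketch (hypothetical class and method names, not the actual ClusterJ internals): a wrapper whose underlying handle is created by a separate call throws NPE if that call is skipped before reading.

```java
// Minimal sketch of the failure mode (hypothetical names, not ClusterJ code):
// the blob handle is created by a separate activate() call, and reading
// before that call dereferences null.
class BlobWrapper {
    private byte[] handle;                         // created only by activate()

    void activate() { handle = new byte[] {1, 2, 3}; }

    int getLength() { return handle.length; }      // NPE if activate() was skipped
}

public class Demo {
    public static void main(String[] args) {
        BlobWrapper blob = new BlobWrapper();
        try {
            blob.getLength();                      // activate() never called
        } catch (NullPointerException e) {
            System.out.println("NPE: blob handle not created");
        }
        blob.activate();                           // the "forgotten" call
        System.out.println("length=" + blob.getLength());
    }
}
```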

How to repeat:
--- bug.sql ---

    CREATE TABLE `testblob` (
      `id` bigint(20) NOT NULL,
      `session` bigint(20) DEFAULT NULL COMMENT 'SESSION ID',
      `payload` blob,  -- any BLOB or TEXT column reproduces the bug
      PRIMARY KEY (`id`),
      UNIQUE KEY `testblob_UNIQUE` (`session`),
      INDEX `id_idx` (`id`),
      INDEX `session_idx` (`session`) USING BTREE
    ) ENGINE=ndbcluster DEFAULT CHARSET=latin1;

    insert into testblob (`id`,`session`,`payload`) values (1,11111111,'test payload 1');
    insert into testblob (`id`,`session`,`payload`) values (2,22222222,'test payload 2');
    insert into testblob (`id`,`session`,`payload`) values (3,33333333,'test payload 3');

--- Test.java ---

    static SessionFactory sessionFactory = createSessionFactory();

    @PersistenceCapable(table = "testblob")
    @JsonPropertyOrder({"id", "session", "payload"})
    @Index(name = "session_idx", columns = {@Column(name = "session")})
    public interface NdbClusterRecord {

        @JsonSerialize(using = ToStringSerializer.class)
        @Column(name = "id")
        long getId();
        void setId(long id);

        @JsonSerialize(using = ToStringSerializer.class)
        @Column(name = "session")
        long getSession();
        void setSession(long session);

        @Column(name = "payload")
        byte[] getPayload();
        void setPayload(byte[] payload);
    }

    static private SessionFactory createSessionFactory() {

        final Properties props = new Properties();
        props.put("com.mysql.clusterj.connectstring", "localhost:1186");
        props.put("com.mysql.clusterj.database", "db");
        props.put("com.mysql.clusterj.connect.retries", "1");
        props.put("com.mysql.clusterj.connect.delay", "5");
        props.put("com.mysql.clusterj.connect.verbose", "1");
        props.put("com.mysql.clusterj.connect.timeout.before", "5");
        props.put("com.mysql.clusterj.connect.timeout.after", "5");
        props.put("com.mysql.clusterj.connection.pool.size", "1");

        return ClusterJHelper.getSessionFactory(props);
    }

    private void doTest() {
        Session session = sessionFactory.getSession();

        try {
            QueryBuilder qb = session.getQueryBuilder();
            QueryDomainType<NdbClusterRecord> dobj = qb.createQueryDefinition(NdbClusterRecord.class);
            Query<NdbClusterRecord> query = session.createQuery(dobj);

            List<NdbClusterRecord> results = query.getResultList();

            System.out.println("size:" + results.size());
            for (NdbClusterRecord r : results) {
                System.out.println("r:" + r.getId() + " session:" +
                        r.getSession() + " payload:" + new String(r.getPayload()));
            }
        } catch (Exception ex) {
            ex.printStackTrace();
        } finally {
            session.close();
        }
    }
Suggested fix:

class NdbRecordScanResultDataImpl extends NdbRecordResultDataImpl {

    public boolean next() {
        ...
                        // SUGGESTED FIX:
                        // Need to activate blobs before
                        // reading data into internal arrays
                        // (field name abbreviated to "operation" here):
                        operation.activateBlobs();

                        // load blob data into the operation
                        operation.loadBlobValues();
        ...
    }
}
[18 Jun 10:56] Bogdan Kecman
Hi Andrew,

The behavior is verified, as is the patch to fix it, but I'm not sure whether it would break old code; we'll have to wait for the ClusterJ team to answer that one. I'm verifying the issue and shipping it to them, so we'll see :). Thanks for the report and for the fix.

kind regards