
At least one SQL statement is required for execution #1828

Open · xlvchao opened this issue Sep 19, 2024 · 19 comments

xlvchao commented Sep 19, 2024

JDBC version: 0.4.6
ClickHouse version: 21.8.10.19

When I do a batch insert, I sometimes get this error:

java.sql.BatchUpdateException: At least one SQL statement is required for execution, server ClickHouseNode [uri=http://10.109.152.2:8123/default, options={jdbcCompliant=false,collect_timeout=3600,max_execution_time=3600,socket_timeout=4800000,max_queued_buffers=0}]@-232571340
at com.clickhouse.jdbc.SqlExceptionUtils.batchUpdateError(SqlExceptionUtils.java:107)
at com.clickhouse.jdbc.internal.SqlBasedPreparedStatement.executeAny(SqlBasedPreparedStatement.java:194)
at com.clickhouse.jdbc.internal.AbstractPreparedStatement.executeLargeBatch(AbstractPreparedStatement.java:85)
at com.clickhouse.jdbc.internal.ClickHouseStatementImpl.executeBatch(ClickHouseStatementImpl.java:754)
at com.zaxxer.hikari.pool.ProxyStatement.executeBatch(ProxyStatement.java:127)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeBatch(HikariProxyPreparedStatement.java)
at com.hihonor.aiops.clickhouse.component.SinkWriter.writeToClickhouse(SinkWriter.java:80)
at com.hihonor.aiops.databus.task.ClickHouseTask.doWriteToCk(ClickHouseTask.java:190)
at com.hihonor.aiops.databus.task.ClickHouseTask.writeToCk(ClickHouseTask.java:122)
at com.hihonor.aiops.databus.task.ClickHouseTask.tryWriteToCk(ClickHouseTask.java:105)
at com.hihonor.aiops.databus.task.ClickHouseTask.run(ClickHouseTask.java:96)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

There is no problem with the data, so what is the reason for this error?

xlvchao added the bug label Sep 19, 2024
xlvchao (Author) commented Sep 19, 2024

@chernser Can you help me?

chernser added the module-jdbc and investigating labels Sep 19, 2024
chernser (Contributor) commented

Hi @xlvchao!
It seems the statement was called with an empty batch. Do you use the JDBC driver directly, or inside some framework (like Spark)?
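
For illustration, a minimal sketch of the empty-batch hypothesis (the URL and table name t are placeholders, not from this issue; whether an empty batch actually surfaces this exact error can depend on the driver version - note the later test results further down the thread):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class EmptyBatchSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder URL; assumes a reachable ClickHouse node.
            String url = "jdbc:clickhouse://localhost:8123/default";
            try (Connection conn = DriverManager.getConnection(url);
                 PreparedStatement ps = conn.prepareStatement(
                         "INSERT INTO t (a) VALUES (?)")) {
                // No setObject()/addBatch() calls, so the batch is empty.
                ps.executeBatch(); // may surface "At least one SQL statement is required"
            }
        }
    }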

xlvchao (Author) commented Sep 19, 2024

@chernser Yes, I use the JDBC driver directly in a Spring Boot program!

chernser (Contributor) commented

@xlvchao is it possible that the batch can be empty?
Would you be able to share sample code showing how the JDBC driver is used? Thanks!

xlvchao (Author) commented Sep 21, 2024

@zhicwu Bro, do you have any ideas?

chernser (Contributor) commented

Hi @xlvchao!
Have you tried upgrading to a newer version?

xlvchao (Author) commented Nov 17, 2024

> Hi @xlvchao! Have you tried upgrading to a newer version?

@chernser
Yes, I upgraded from 0.4.6 to 0.6.4, but the problem still exists!

When I reduced the batch size from 8000 to 1000, the problem disappeared! This is very strange!

Could this problem be related to the length of the SQL? When the batch size is 8000, the SQL exceeds 30 MB!
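
For reference, a sketch of that batch-size workaround, slotted into the writeToClickhouse flow shown later in this thread - flushing in fixed-size chunks instead of one huge batch (bindRow is a hypothetical stand-in for the real parameter-binding code; 1000 is the size reported to work above):

    int chunkSize = 1000; // the size that made the error disappear in this report
    int pending = 0;
    for (JSONObject row : validDataList) {
        bindRow(prepareStatement, row); // hypothetical: binds all columns of one row
        prepareStatement.addBatch();
        if (++pending >= chunkSize) {
            prepareStatement.executeBatch(); // flush a full chunk
            pending = 0;
        }
    }
    if (pending > 0) {
        prepareStatement.executeBatch(); // flush the remainder
    }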

chernser (Contributor) commented

@xlvchao,
Thank you for the information! I will look into it today. I think it may be incorrect concatenation of statements, or maybe something with compression.
By the way, do you have any complex types in the inserts? I've found a bug in the reading code for AggregateFunction - there may be a similar case for another type while writing. Such bugs are usually hard to reproduce and have varying effects.

chernser (Contributor) commented

Found a similar problem in #1259 (comment) - for some reason an invalid statement resulted in an empty statement.

The problem is in the client:

    List<String> stmts = sealedRequest.getStatements(false);
    int size = stmts.size();
    String sql;
    if (size == 0) {
        throw new IllegalArgumentException("At least one SQL statement is required for execution");
    } else if (size > 1) {
        throw new IllegalArgumentException("Expect one SQL statement to execute but we got " + size);
    } else {
        sql = stmts.get(0);
    }
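
Per the stack trace at the top of this issue, that IllegalArgumentException appears to be wrapped by SqlExceptionUtils.batchUpdateError into the java.sql.BatchUpdateException the application sees, so the failure is raised client-side before any request reaches the server.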

chernser (Contributor) commented

@xlvchao

I've investigated the code and found that this may happen under a narrow set of conditions: when the internal statement is SQL-based (data is sent via a series of INSERT statements) and the statement has no parameters. I think there is some invalid parsing of the initial query. Would you please share an example very close to the original statement, and describe how you send the data?

Thanks!

xlvchao (Author) commented Nov 23, 2024

> I've investigated the code and found that this may happen under a narrow set of conditions: when the internal statement is SQL-based (data is sent via a series of INSERT statements) and the statement has no parameters. I think there is some invalid parsing of the initial query. Would you please share an example very close to the original statement, and describe how you send the data?
>
> Thanks!

@chernser

Here is the table script:

create table aiops_local_prd.quantile_aiops_collect_93
(
    product     String,
    domain      String,
    interface   String,
    returnCode  String,
    requestType String,
    az          String,
    count       Float64,
    fail Nullable(Int64),
    success Nullable(Int64),
    request_time_quantile AggregateFunction(quantilesTDigest(0.95), Float64),
    upstream_response_time_quantile AggregateFunction(quantilesTDigest(0.95), Float64),
    body_bytes_sent_quantile AggregateFunction(quantilesTDigest(0.95), Float64),
    time        DateTime,
    createTime  DateTime
)
    engine = ReplicatedMergeTree('/clickhouse/tables/{layer}-{shard}/aiops_local_prd/quantile_aiops_collect_93',
             '{replica}')
        PARTITION BY toYYYYMMDD(time)
        ORDER BY time
        TTL time + toIntervalMonth(3)
        SETTINGS index_granularity = 8192;

Here is the insert template:

INSERT INTO aiops_local_prd.quantile_aiops_collect_93 (product,domain,interface,returnCode,requestType,az,count,fail,success,request_time_quantile,upstream_response_time_quantile,body_bytes_sent_quantile,time,createTime) VALUES (?,?,?,?,?,?,?,?,?,unhex(?),unhex(?),unhex(?),?,?)

Here is how JDBC is used:

public FlushResponse writeToClickhouse(String clickHouseNode, Long collectId, List<JSONObject> dataList) {
        String nodeTableKey = TableUtil.getTableMapKey(clickHouseNode,collectId);
        if (!tableInfoMap.containsKey(nodeTableKey)) {
            logger.warn("Failed while flush data to ClickHouse, because of table not exist, database = {}, collectId =  {}", database, collectId);
            failureHandler.handle(database, collectId, dataList, FailEnum.TABLE_NOT_EXIST, FailEnum.TABLE_NOT_EXIST.getMsg());
            return FlushResponse.tableNotExist();
        }
        SqlTemplate sqlTmpObject = tableInfoMap.get(nodeTableKey);
        String sqlTemplate = sqlTmpObject.getSqlTemplate();
        HikariDataSource dataSource = Globals.HIKARI_DATA_SOURCES.get(clickHouseNode);

        try (Connection conn = dataSource.getConnection();
             PreparedStatement prepareStatement = conn.prepareStatement(sqlTemplate)) {
            List<String> fields = sqlTmpObject.getFields();
            Map<String, FieldInfo> fieldInfoMap = sqlTmpObject.getFieldInfoMap();

            List<JSONObject> validDataList = new ArrayList<>();
            checkParameters(collectId, fields, fieldInfoMap, dataList, validDataList);
            assembleParameters(prepareStatement, fields, fieldInfoMap, validDataList);

            long s = System.currentTimeMillis();
            prepareStatement.executeBatch();
            long e = System.currentTimeMillis();
            successHandler.handle(database, collectId, dataList, (e - s));
            logger.info("SDK Successful flush data to ClickHouse, database = {}, collectId =  {},  elapsed = {}ms, valid data size = {}, invalid data size = {}", database, collectId, (e - s), validDataList.size(), dataList.size() - validDataList.size());
            return FlushResponse.flushSuccess();

        } catch (DataAssembleException e) {
            logger.error("SDK Failed while flush data to ClickHouse, because of data parse exception, database = {}, collectId =  {}", database, collectId, e);
            failureHandler.handle(database, collectId, dataList, FailEnum.DATA_PARSE_FAILED, FailEnum.DATA_PARSE_FAILED.getMsg());
            return FlushResponse.dataParseFailed();

        } catch (Exception e) {
            String message = e.getMessage();
            Exception newE = e;
            FailEnum failEnum = FailEnum.FLUSH_DATA_FAILED;
            FlushResponse flushResponse = FlushResponse.flushDataFailed();
            if (StringUtils.isNotBlank(message)) {
                if (message.contains(Globals.ZK_SESSION_EXPIRED)) {
                    newE = new ZkSessionExpiredException(message);
                    failEnum = FailEnum.ZK_SESSION_EXPIRED;
                    flushResponse = FlushResponse.zkSessionExpired();
                } else if (message.contains(Globals.ZK_TABLE_READ_ONLY)) {
                    newE = new ZkTableReadonlyException(message);
                    failEnum = FailEnum.ZK_TABLE_READ_ONLY;
                    flushResponse = FlushResponse.zkTableReadOnly();
                } else if (message.contains(Globals.ZK_ALLOCATE_FAILED)) {
                    newE = new ZkAllocateFailedException(message);
                    failEnum = FailEnum.ZK_ALLOCATE_FAILED;
                    flushResponse = FlushResponse.zkAllocateFailed();
                } else if (message.contains(Globals.CK_TOO_MANY_PARTS)) {
                    newE = new CkTooManyPartsException(message);
                    failEnum = FailEnum.CK_TOO_MANY_PARTS;
                    flushResponse = FlushResponse.ckTooManyParts();
                } else if (message.contains(Globals.CK_MATERIAL_VIEW_ERROR)) {
                    newE = new CkMaterialViewException(message);
                    failEnum = FailEnum.CK_MATERIAL_VIEW_ERROR;
                    flushResponse = FlushResponse.ckMaterialViewError();
                } else if (message.contains(Globals.CK_TABLE_COLUMNS_ERROR)) {
                    newE = new CkTableColumnsException(message);
                    failEnum = FailEnum.CK_TABLE_COLUMNS_ERROR;
                    flushResponse = FlushResponse.ckTableColumnsError();
                } else if (message.contains(Globals.CK_WRITING_REQUEST_ERROR)) {
                    newE = new CkWritingRequestException(message);
                    failEnum = FailEnum.CK_WRITING_REQUEST_ERROR;
                    flushResponse = FlushResponse.ckWritingRequestError();
                } else if (message.contains(Globals.CK_CONNECTION_ERROR)) {
                    newE = new CkConnectionException(message);
                    failEnum = FailEnum.CK_CONNECTION_ERROR;
                    flushResponse = FlushResponse.ckConnectionError();
                }
            }
            logger.error("SDK Failed while flush data to ClickHouse, because of clickhouse exception, ck node:{}, database = {}, collectId =  {}!", clickHouseNode, database, collectId, newE);
            failureHandler.handle(database, collectId, dataList, failEnum, failEnum.getMsg());
            return flushResponse;
        }
    }

private void checkParameters(Long collectId, List<String> fields, Map<String, FieldInfo> fieldInfoMap, List<JSONObject> dataList, List<JSONObject> validDataList) throws Exception {
        for (int i = 0; i <= dataList.size() - 1; i++) {
            JSONObject data = dataList.get(i);
            String field = "";
            Object fieldValue = null;
            String fieldType = "";
            try {
                for (int j = 0; j < fields.size(); j++) {
                    field = fields.get(j);
                    fieldValue = data.get(field);
                    fieldType = fieldInfoMap.get(field).getType();
                    if (null != fieldValue) {
                        if ("String".equals(fieldType) || "Nullable(String)".equals(fieldType)) {
                            if (!(fieldValue instanceof String)) throw new Exception();
                        } else if ("Int32".equals(fieldType) || "Nullable(Int32)".equals(fieldType)) {
                            Integer.parseInt(fieldValue.toString());
                        } else if ("Int64".equals(fieldType) || "Nullable(Int64)".equals(fieldType)) {
                            Long.parseLong(fieldValue.toString());
                        } else if ("Float32".equals(fieldType) || "Nullable(Float32)".equals(fieldType)) {
                            Float.parseFloat(fieldValue.toString());
                        } else if ("Float64".equals(fieldType) || "Nullable(Float64)".equals(fieldType)) {
                            Double.parseDouble(fieldValue.toString());
                        } else if ("DateTime".equals(fieldType) || "Nullable(DateTime)".equals(fieldType)) {
                            if (DateTimeUtil.convertToTimestamp(fieldValue.toString()) < 0) throw new Exception();
                        }
                    }
                }
                validDataList.add(data);
            } catch (Exception e) {
                Exception newE = new DataCheckException(String.format("CollectId:%s, fieldName:%s, fieldType:%s, fieldValue:%s", collectId, field, fieldType, fieldValue));
                logger.error("Find invalid parameters when check!", newE);
            }
        }
    }

private void assembleParameters(PreparedStatement prepareStatement, List<String> fields, Map<String, FieldInfo> fieldInfoMap, List<JSONObject> dataList) throws Exception {
        for (int i = 0; i < dataList.size(); i++) {
            JSONObject data = dataList.get(i);
            try {
                int idx = 1;
                for (String field : fields) {
                    Object value = data.get(field);
                    String type = fieldInfoMap.get(field).getType();
                    if (null == value) {
                        if ("String".equals(type)) {
                            prepareStatement.setString(idx++, "");
                        } else if ("Nullable(String)".equals(type)) {
                            prepareStatement.setNull(idx++, Types.VARCHAR);
                        } else if ("Int32".equals(type)) {
                            prepareStatement.setInt(idx++, 0);
                        } else if ("Nullable(Int32)".equals(type)) {
                            prepareStatement.setNull(idx++, Types.INTEGER);
                        } else if ("Int64".equals(type)) {
                            prepareStatement.setLong(idx++, 0L);
                        } else if ("Nullable(Int64)".equals(type)) {
                            prepareStatement.setNull(idx++, Types.BIGINT);
                        } else if ("Float32".equals(type)) {
                            prepareStatement.setFloat(idx++, 0F);
                        } else if ("Nullable(Float32)".equals(type)) {
                            prepareStatement.setNull(idx++, Types.FLOAT);
                        } else if ("Float64".equals(type)) {
                            prepareStatement.setDouble(idx++, 0D);
                        } else if ("Nullable(Float64)".equals(type)) {
                            prepareStatement.setNull(idx++, Types.DOUBLE);
                        } else if ("DateTime".equals(type)) {
                            prepareStatement.setString(idx++, DateTimeUtil.getCurrentUtcDatetime());
                        } else if ("Nullable(DateTime)".equals(type)) {
                            prepareStatement.setNull(idx++, Types.VARCHAR);
                        } else if (type.contains("AggregateFunction")) {
                            prepareStatement.setNull(idx++, Types.VARCHAR);
                        }
                    } else {
                        prepareStatement.setObject(idx++, value);
                    }
                }
                prepareStatement.addBatch();
            } catch (Exception e) {
                Exception newE = new DataAssembleException(data.toJSONString(), e);
                logger.error("Find invalid parameters when assemble!", newE);
                throw newE;
            }
        }
    }
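
Worth noting about the code above: checkParameters only logs invalid rows and does not rethrow, so if every row in a batch fails validation, validDataList ends up empty, assembleParameters never calls addBatch(), and executeBatch() runs on an empty batch - one plausible (unconfirmed) path to the error in this issue. A minimal defensive guard in writeToClickhouse, sketched under that assumption:

    // Hypothetical guard just before prepareStatement.executeBatch():
    if (validDataList.isEmpty()) {
        // Nothing was added to the batch; executing it would leave the driver
        // with zero statements to run.
        logger.warn("Skipping executeBatch(): no valid rows, collectId = {}", collectId);
        return FlushResponse.flushSuccess(); // or a dedicated "nothing to flush" response
    }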

Here is one row of data from a certain batch write.
Note: I just picked one row; it may not be the one that causes the insert exception.

{"request_time_quantile":"91010000c0400000803f0000e0400000803f0000e0400000803f0000e0400000803f0000e0400000803f0000e0400000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f0000004100000040000000410000004000000841000000400000104100000040000010410000004000001041000000400000104100000040000010410000004000001041000000400000184100000040000020410000004000002041000000400000204100000040000020410000404000002041000040400000204100004040000020410000404000002041000040400000204100004040000020410000404000002041000040405555254100004040000030410000404000003041000040400000304100004040000030410000404000003041000040400000304100004040000030410000804000003041000080400000304100008040000030410000804000003041000080400000344100008040000040410000804000004041000080400000404100008040000040410000804000004041000080400000404100008040000040410000804000004041000080400000404100008040000050410000804000005041000080400000504100008040000050410000804000005041000080400000504100008040000050410000804000005041000080400000604100008040000060410000804000006041000080400000604100008040000060410000804000006041000080400000604100008040000060410000804000006c41000080400000704100008040000070410000804000007041000080400000704100008040000070410000804000007c410000804000008041000080400000804100008040000080410000804000008041000080400000804100008040000080410000804000008041000080400000804100008040000084410000804000008841000080400000884100008040000088410000804000008841000080400000884100008040abaa8a4100004040000090410000404000009041000040400000904100004040000090410000404055559541000040400000984100004040000098410000404000009841000040400000984100004040abaa9a41000040400000a041000040400000a041000040400000a041000040400000a041000040400000a041000000400000a841000000400000a841000000400000a841000000400000a841000000400000ac41000000400000b041000000400000b841000000400000c041000000400000c841000000400000d041000000400000fc410000004000000c420000004000002c4200000040000084420000803f000086420000803f00008c420000803f000098420000803f00009c420000803f00009e420000803f0000a0420000803f0000a0420000803f0000a8420000803f0000d0420000803f0000dc420000803f000002430000803f000008430000803f000013430000803f000017430000803f000025430000803f00002d430000803f000049430000803f","product":"AdPlatform","code":"200","requestType":"POST","__offset":3742997,"interface":"/api/ad-bizlog/v1/media/workflow","__topic":"quantile_aiops_collect_93","returnCode":"388","fail":"0","__partition":26,"createTime":"2024-09-18 
15:49:16","success":"388","domain":"ads-adnet-drcn.cloud.honor.com","az":"None","body_bytes_sent_quantile":"91010000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f","time":"2024-09-14 
12:46:00","upstream_response_time_quantile":"91016f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803fa69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040400ad7a33b000040400ad7a33b000040400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080403108ac3b00008040a69bc43b00008040a69bc43b00008040a69bc43b00008040a69bc43b00008040a69bc43b00008040a69bc43b00008040a69bc43b00008040a69bc43b00008040f4fdd43b000080404260e53b000080404260e53b000080404260e53b000080404260e53b000080404260e53b000080404260e53b00008040b6f3fd3b000080406f12033c000080406f12033c000080406f12033c000080406f12033c000080406f12033c00008040bc74133c00008040bc74133c00008040bc74133c00008040bc74133c00008040bc74133c00008040bc74133c00008040f0601e3c000040400ad7233c000040400ad7233c000040400ad7233c000040400ad7233c000040400ad7233c000040400ad7233c000040400ad7233c000040405839343c000040405839343c000040405839343c000040405839343c000040405839343c000040405839343c0000404072af393c00004040a69b443c00000040a69b443c00000040a69b443c00000040a69b443c00000040a69b443c00000040a69b443c00000040a69b443c00000040cdcc4c3c00000040f4fd543c000000401b2f5d3c000000404260653c0000004068916d3c00000040b6f37d3c000000401283c03c000000408fc2f53c0000803f31082c3d0000803ffca9713d0000803f022b873d0000803f4c37893d0000803f295c8f3d0000803fbc74933d0000803f2db29d3d0000803fc1caa13d0000803f5c8fc23d0000803f60e5d03d0000803fd9cef73d0000803f9318043e0000803f295c0f3e0000803f7368113e0000803f9cc4203e0000803f7b142e3e0000803f8195433e0000803f"}

chernser (Contributor) commented

@xlvchao thank you so much for the information!
I will start with AggregateFunction and then try replicating the issue.

chernser removed the investigating label Nov 25, 2024
chernser added this to the Priority Backlog milestone Nov 25, 2024
chernser self-assigned this Nov 25, 2024
xlvchao (Author) commented Nov 26, 2024

> @xlvchao thank you so much for the information!
> I will start with AggregateFunction and then try replicating the issue.

@chernser How's it going, buddy?

chernser (Contributor) commented

Good day, @xlvchao!

I looked into it yesterday.

  1. Since you are using unhex() in the insert template, JDBC will not use the RowBinary format and will just compile one big SQL statement. I'm not sure whether unhex() can be bypassed here (see the sketch below).
  2. When a batch is executed, the SQL is built into a String twice, from two different places (I will fix that), but it should not be the root cause.
  3. I've tested different scenarios - an empty batch, an incomplete batch, all NULLs - and none of them causes the error.

I will continue next week, starting Monday.
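
To illustrate point 1, a comparison of the two template shapes (column lists trimmed for brevity; that the plain template enables the streaming path is an assumption about the driver, not something confirmed in this thread):

    // Reported template: unhex(?) wraps a placeholder, so the driver falls back
    // to compiling one large SQL string for the whole batch (point 1 above).
    String sqlBasedTemplate =
            "INSERT INTO aiops_local_prd.quantile_aiops_collect_93 "
            + "(product, request_time_quantile) VALUES (?, unhex(?))";

    // Assumed alternative: bare placeholders with no surrounding function calls,
    // which (assumption) would let clickhouse-jdbc use its RowBinary path.
    String plainTemplate =
            "INSERT INTO aiops_local_prd.quantile_aiops_collect_93 "
            + "(product, request_time_quantile) VALUES (?, ?)";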

xlvchao (Author) commented Nov 28, 2024

Good day, @chernser!

There is a key point I forgot to mention!

When this error ("At least one SQL statement...") occurs, retrying the same batch data several times eventually succeeds!

chernser (Contributor) commented Dec 2, 2024

Thank you for the additional information! It is strange that it requires several retries, because this is a client-side error. I will look into it.

xlvchao (Author) commented Dec 11, 2024

@chernser Hi, is everything ok? Have you found the root cause?

chernser (Contributor) commented

Good day, @xlvchao!
Sorry - I'm working on other issues, but I keep reminding myself about this one.
Currently I have no ideas :-(

xlvchao (Author) commented Dec 20, 2024

> Good day, @xlvchao!
> Sorry - I'm working on other issues, but I keep reminding myself about this one.
> Currently I have no ideas :-(

@chernser Hello, buddy 😂
