clickhouse-kafka-connect
ClassCastException due to invalid field value to column name mapping for Tuples
Describe the bug
For Tuple columns, the doWriteColValue method in ClickHouseWriter.java uses
Streams.zip(
        col.getTupleFields().stream(), value.getFields().stream(), Tuples::of
).forEach((fields) -> {
    Column column = fields.getT1();
    Field field = fields.getT2();
    Data innerData = (Data) jsonMapValues.get(field.name());
    try {
        doWriteColValue(column, stream, innerData, defaultsSupport);
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
});
to look up each field's value in jsonMapValues for the corresponding column. However, col.getTupleFields() and value.getFields() are not guaranteed to be in the same order (neither is sorted by column name), so Streams.zip can pair a column with the wrong field. The mismatched value is then written for that column and ultimately causes a ClassCastException in doWritePrimitive.
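A minimal, self-contained sketch of the pitfall (hypothetical Column/Field records and values, not the connector's actual classes): pairing the two sequences by position breaks as soon as the record's field order differs from the table's column order, while a name-keyed lookup stays correct.

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class TupleOrderDemo {
    record Column(String name, String type) {}
    record Field(String name, Object value) {}

    public static void main(String[] args) {
        // Columns as declared in the table: (id Int32, label String)
        List<Column> columns = List.of(new Column("id", "Int32"), new Column("label", "String"));
        // Fields as they arrive in the record, in a different order
        List<Field> fields = List.of(new Field("label", "abc"), new Field("id", 42));

        // Positional pairing (what zipping the streams effectively does):
        // "id" is paired with "abc" and "label" with 42 -- the wrong values.
        for (int i = 0; i < columns.size(); i++) {
            System.out.println(columns.get(i).name() + " <- " + fields.get(i).value());
        }

        // Name-based lookup: each column receives the value that belongs to it,
        // regardless of the order in which the fields arrive.
        Map<String, Object> byName = fields.stream()
                .collect(Collectors.toMap(Field::name, Field::value));
        for (Column c : columns) {
            System.out.println(c.name() + " <- " + byName.get(c.name()));
        }
    }
}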
Steps to reproduce
- Create a ClickHouse table that uses a Tuple column whose fields have different data types, such as Int32, Int64, Float64, and String (see the example table after this list).
- Create a Kafka Connect MongoDB source connector to push data from a collection to a Kafka topic.
- Create a Kafka Connect ClickHouse sink connector to read the messages from the Kafka topic and insert them into the ClickHouse table.
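For the first step, a minimal example table (hypothetical names; any Tuple mixing several types should reproduce the issue):

CREATE TABLE tuple_test
(
    -- plain column plus a named Tuple mixing integer, float, and string fields
    id Int32,
    payload Tuple(count Int64, score Float64, label String)
)
ENGINE = MergeTree
ORDER BY id;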
Error log
java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.String