kafka-connect-jdbc
Cannot handle arrays when decimal.handling.mode is set to double
I know that #805 clearly states that it only helps for primitive-type arrays; however, there is a conflict with setting decimal.handling.mode to double.
This setting is a common workaround when NUMERIC or DECIMAL types are used without scale constraints, since there is no mapping for the VariableScaleDecimal struct when using the Debezium Postgres connector.
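For context, this is roughly where the setting lives in the Debezium source connector config; a minimal sketch, with hostnames, database names, and the rest of the connector config as placeholders:

```properties
# Illustrative Debezium Postgres source connector config (host/db names are placeholders)
connector.class=io.debezium.connector.postgresql.PostgresConnector
database.hostname=localhost
database.port=5432
database.dbname=mydb
# Workaround: map unconstrained NUMERIC/DECIMAL to double instead of VariableScaleDecimal
decimal.handling.mode=double
```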
As I understand it, having a numeric or decimal array together with the aforementioned setting results not in a primitive double array but in an array of Double references, equivalent to Java Double[], so the connector closes with the following exception:
```
org.postgresql.util.PSQLException: Cannot cast an instance of [Ljava.lang.Double; to type Types.ARRAY
    at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:652)
    at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:887)
    at io.confluent.connect.jdbc.dialect.PostgreSqlDatabaseDialect.maybeBindPrimitive(PostgreSqlDatabaseDialect.java:439)
```
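To illustrate the failure mode outside the connector, here is a minimal, self-contained sketch (connection URL, credentials, and table name are hypothetical): PgJDBC rejects a boxed Double[] passed to setObject with Types.ARRAY, while a java.sql.Array created via Connection.createArrayOf binds fine.

```java
import java.sql.Array;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ArrayBindSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection; table "t" is assumed to have a column "vals float8[]"
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "pass")) {

            // With decimal.handling.mode=double, the converted record field holds
            // boxed Double references rather than a primitive double[].
            Double[] boxed = {1.5, 2.5};

            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO t (vals) VALUES (?)")) {
                // Roughly what the dialect ends up doing for the boxed array:
                // PgJDBC cannot cast [Ljava.lang.Double; to Types.ARRAY and throws
                // the PSQLException quoted above.
                // ps.setObject(1, boxed, java.sql.Types.ARRAY);

                // Binding succeeds if the boxed array is first wrapped in a
                // java.sql.Array of the matching PostgreSQL element type:
                Array sqlArray = conn.createArrayOf("float8", boxed);
                ps.setArray(1, sqlArray);
                ps.executeUpdate();
            }
        }
    }
}
```

This suggests one possible direction for a fix: wrapping boxed numeric arrays via createArrayOf before binding, rather than passing them to setObject directly, though I haven't explored how that would fit into maybeBindPrimitive.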
I fully understand that the purpose of this fix was to handle at least primitive-type arrays, but I wanted to let you know that it would be of great help to us to expand the fix, since the numeric[] type is commonly used in our schema. Any other ideas for handling this are more than welcome. Thanks!
Any update here, @andrikoz?
I'm facing the same issue. Could you please take a look at https://github.com/confluentinc/kafka-connect-jdbc/issues/1127?