
Cannot handle arrays when decimal.handling.mode as double

Open andrikoz opened this issue 3 years ago • 1 comments

I know that #805 clearly states that the fix only helps for primitive-type arrays; however, there is a conflict when `decimal.handling.mode` is set to `double`.

This setting is a common workaround when `NUMERIC` or `DECIMAL` types are used without scale constraints, since there is no mapping for the `VariableScaleDecimal` struct when using the Debezium Postgres connector.
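For reference, the workaround is applied in the Debezium source connector configuration. A minimal sketch (the connector name and the other properties are placeholders; only `decimal.handling.mode` is the setting discussed here):

```json
{
  "name": "postgres-source",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "decimal.handling.mode": "double"
  }
}
```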

As I understand it, having a numeric or decimal array together with the aforementioned setting results not in a primitive double array but in an array of `Double` references, equivalent to Java's `Double[]`, so the connector fails with the following exception:
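To illustrate why the cast fails: a Java `Double[]` is an object array at runtime, not a primitive `double[]` and not a `java.sql.Array`, so the driver's `setObject` cannot bind it as `Types.ARRAY`. A minimal standalone sketch of the distinction (the unboxing step shown is one possible workaround, not the connector's actual code):

```java
import java.util.Arrays;

public class ArrayTypeDemo {
    public static void main(String[] args) {
        double[] primitive = {1.5, 2.5};   // component type: double (primitive)
        Double[] boxed = {1.5, 2.5};       // component type: java.lang.Double (references)

        // The two arrays have distinct runtime classes: [D vs [Ljava.lang.Double;
        System.out.println(primitive.getClass().getComponentType().isPrimitive()); // prints true
        System.out.println(boxed.getClass().getComponentType().isPrimitive());     // prints false

        // Unboxing Double[] to double[] before binding would sidestep the cast failure
        double[] unboxed = Arrays.stream(boxed).mapToDouble(Double::doubleValue).toArray();
        System.out.println(Arrays.equals(primitive, unboxed)); // prints true
    }
}
```

Alternatively, binding could go through `Connection.createArrayOf("float8", boxed)` and `PreparedStatement.setArray`, which accepts object arrays, but that requires a live connection and is likewise just a sketch.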

```
org.postgresql.util.PSQLException: Cannot cast an instance of [Ljava.lang.Double; to type Types.ARRAY
    at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:652)
    at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:887)
    at io.confluent.connect.jdbc.dialect.PostgreSqlDatabaseDialect.maybeBindPrimitive(PostgreSqlDatabaseDialect.java:439)
```

I fully understand that the purpose of that fix was to handle at least primitive-type arrays, but I wanted to let you know that expanding the fix would be of great help to us, since the `numeric[]` type is commonly used in our schema. Any other ideas for handling this are more than welcome. Thanks!

andrikoz · May 14, 2021

Any update here, @andrikoz?

I'm facing the same issue. Could you please take a look at https://github.com/confluentinc/kafka-connect-jdbc/issues/1127 for me?

dungnt081191 · Oct 21, 2021