Robin Moffatt

34 comments of Robin Moffatt

https://dev.mysql.com/doc/refman/8.0/en/fixed-point-types.html

> In MySQL, NUMERIC is implemented as DECIMAL

So the following DDL declares all four amount columns as `DECIMAL` under the covers:

```
CREATE TABLE NUM_TEST (
  TXN_ID INT,
  CUSTOMER_ID INT,
  AMOUNT_01 DECIMAL(5,2),
  AMOUNT_02 NUMERIC(5,2),
  AMOUNT_03 DECIMAL(5),
  AMOUNT_04 DECIMAL
);
```
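Illustrative `DESCRIBE` output for that table (not from the original comment; exact display varies by MySQL 8.0 patch version). Note that a bare `DECIMAL` defaults to `decimal(10,0)` and `DECIMAL(5)` to `decimal(5,0)`:

```
mysql> DESCRIBE NUM_TEST;
+-------------+---------------+------+-----+---------+-------+
| Field       | Type          | Null | Key | Default | Extra |
+-------------+---------------+------+-----+---------+-------+
| TXN_ID      | int           | YES  |     | NULL    |       |
| CUSTOMER_ID | int           | YES  |     | NULL    |       |
| AMOUNT_01   | decimal(5,2)  | YES  |     | NULL    |       |
| AMOUNT_02   | decimal(5,2)  | YES  |     | NULL    |       |
| AMOUNT_03   | decimal(5,0)  | YES  |     | NULL    |       |
| AMOUNT_04   | decimal(10,0) | YES  |     | NULL    |       |
+-------------+---------------+------+-----+---------+-------+
```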

Contrast to Postgres:

```
CREATE TABLE NUM_TEST (
  TXN_ID INT,
  CUSTOMER_ID INT,
  AMOUNT_01 DECIMAL(5,2),
  AMOUNT_02 NUMERIC(5,2),
  AMOUNT_03 DECIMAL(5),
  AMOUNT_04 DECIMAL
);
```

All columns are stored as `NUMERIC`.
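A representative `\d num_test` listing (illustrative, assuming the table is inspected in psql; exact formatting varies by Postgres version) shows `DECIMAL(5)` stored as `numeric(5,0)` and bare `DECIMAL` as unconstrained `numeric`:

```
demo=# \d num_test
              Table "public.num_test"
   Column    |     Type     | Collation | Nullable | Default
-------------+--------------+-----------+----------+---------
 txn_id      | integer      |           |          |
 customer_id | integer      |           |          |
 amount_01   | numeric(5,2) |           |          |
 amount_02   | numeric(5,2) |           |          |
 amount_03   | numeric(5,0) |           |          |
 amount_04   | numeric      |           |          |
```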

MS SQL notes: https://gist.github.com/rmoff/7bb46a0b6d27982a5fb7a103bb7c95b9#testing-numericmapping-in-ms-sql-server-2017

| | col1 | col2 | col3 | col4 |
|-|-|-|-|-|
| MSSQL column definition | `DECIMAL(5,2)` | `NUMERIC(5,2)` | `DECIMAL(5)` | `DECIMAL` |
| MSSQL created column | ... |
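To see what SQL Server actually creates for each definition, the column metadata can be queried; a sketch (assuming the same `NUM_TEST` table as above; note that in SQL Server a bare `DECIMAL` defaults to `DECIMAL(18,0)`):

```
-- Show the type, precision, and scale SQL Server assigned to each column
SELECT COLUMN_NAME, DATA_TYPE, NUMERIC_PRECISION, NUMERIC_SCALE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'NUM_TEST';
```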

`DECIMAL` isn't supported for `numeric.mapping`, and there isn't a way to work around this that I'm aware of. The data isn't "corrupt"; it's just a `BigDecimal`. For more details see https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector#bytes-decimals-numerics
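Where the source columns do declare precision and scale, `numeric.mapping` can map them to primitive Connect types instead of the byte-encoded `BigDecimal`. A minimal sketch of a JDBC source connector config (connection details and names are placeholders; `best_fit` is one of the documented values, alongside `none` and `precision_only`):

```
{
  "name": "jdbc-source-num-test",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://postgres:5432/demo",
    "connection.user": "demo",
    "connection.password": "demo",
    "table.whitelist": "num_test",
    "mode": "bulk",
    "topic.prefix": "pg-",
    "numeric.mapping": "best_fit"
  }
}
```

Note that `best_fit` only helps where the column has a declared precision; an unparameterised `DECIMAL` still comes through as a `BigDecimal`, which is the limitation described above.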

Related: https://issues.apache.org/jira/browse/KAFKA-5117

Is there a JDBC driver for ClickHouse? Have you tried it and found problems? If so, what? And with which version of Kafka Connect? And with the JDBC source, or the sink?

Related: https://rmoff.net/2019/10/15/skipping-bad-records-with-the-kafka-connect-jdbc-sink-connector/

See https://github.com/confluentinc/kafka-connect-jdbc/pull/999

``` "value.converter": "org.apache.kafka.connect.json.JsonConverter", "value.converter.schemas.enable": "false", ``` The JDBC Sink requires a schema to your data. I'm not sure why this is triggering the error you're seeing, but you definitely to...