kafka-connect-jdbc
Kafka Connect connector for JDBC-compatible databases
Converting byte[] to Kafka Connect data failed due to serialization error of topic connect-topic1:
Even though I'm passing the right schema registry and the converter class, I'm still getting this error: ``` 2022-04-11 11:47:20,038 ERROR WorkerSinkTask{id=databus-sink-connector1-0} Error converting message value in topic 'connect-topic1' partition... ```
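For context, a minimal sketch of the converter settings such a sink would typically carry; the database URL and registry address below are placeholders, not values taken from the issue:

```properties
# Hypothetical JDBC sink sketch; connection and registry details are placeholders.
name=databus-sink-connector1
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=connect-topic1
connection.url=jdbc:postgresql://db-host:5432/mydb

# The value converter must match how the records were actually produced.
# A mismatch (e.g. a JSON converter reading Avro bytes) produces
# "Converting byte[] to Kafka Connect data failed due to serialization error".
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry:8081
```

In practice this error usually points at a converter that does not match the producer-side serialization, so the converter pair is the first thing to compare against how the topic is written.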
I am trying to enable the dead letter queue on my JDBC Sink Connector. In the sink connector configuration, I have provided the following properties: ``` 'errors.tolerance'='all', 'errors.deadletterqueue.topic.name' = 'error_topic', 'errors.deadletterqueue.topic.replication.factor'= -1 ``` But...
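A hedged sketch of a fuller dead letter queue block for a JDBC sink; the topic name is the one quoted above, the other values are illustrative:

```properties
# Dead letter queue sketch for a JDBC sink; values are illustrative.
errors.tolerance=all
errors.deadletterqueue.topic.name=error_topic
# A replication factor of -1 asks the broker for its default and is only
# honored on sufficiently recent broker versions; an explicit value such as
# 1 (dev) or 3 (prod) is often the safer starting point.
errors.deadletterqueue.topic.replication.factor=3
# Optional but useful: record the failure reason in record headers and the log.
errors.deadletterqueue.context.headers.enable=true
errors.log.enable=true
errors.log.include.messages=true
```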
Hi, we are creating a sink connector along with an error handling mechanism, so that if there is a bad record it is routed to the error queue,...
It is impossible to stop the source connector before it has exhausted all its connection attempts. If you configured the wrong DB connection params or your DB is unreachable, you can...
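For reference, the two source-connector settings that govern those connection attempts, sketched with their documented defaults; the connection URL is a placeholder:

```properties
# Sketch of the retry-related source settings; connection details are placeholders.
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:oracle:thin:@db-host:1521/ORCLPDB1
# Number of attempts to obtain a valid JDBC connection before the task fails
# (default 3), and the wait between attempts (default 10000 ms).
connection.attempts=3
connection.backoff.ms=10000
```

With the defaults the task can spend tens of seconds retrying before it gives up, which is the window during which a stop request appears to hang.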
Hi all, version: Kafka-Connect-jdbc:10.4.1. My issue is exactly the same as the one described in the case below. I am getting an error while inserting a null value into a column of type binary...
The error happens when the sink connector attempts to insert a record with a binary data type (varbinary) whose value is null. Inspecting with SQL Profiler shows that an nvarchar(4000) data type is built instead...
Hello, I use the Kafka Connect JDBC sink to write data to Oracle. Once a BLOB field type is added, it becomes very slow. For example, I define the writing batch...
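A sketch of the sink options that bound the write batch, assuming an Oracle target; the table name and numbers are illustrative, not taken from the report:

```properties
# Sketch of batching-related sink settings for an Oracle target; values are examples.
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
connection.url=jdbc:oracle:thin:@db-host:1521/ORCLPDB1
table.name.format=MY_BLOB_TABLE
insert.mode=insert
# batch.size caps how many records go into one batched JDBC statement; with
# large BLOB values the batch the driver can handle efficiently may be far
# smaller than this number.
batch.size=500
# The records fetched per poll also bound the batch; this override is honored
# only if the worker allows connector-level client config overrides.
consumer.override.max.poll.records=500
```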
I have several Oracle tables with fields defined as NUMBER(19,0). I configured numeric.precision.mapping=true and was expecting long but I'm getting bytes. On [DataConverter.java#L208](https://github.com/confluentinc/kafka-connect-jdbc/blob/master/src/main/java/io/confluent/connect/jdbc/source/DataConverter.java#L208) and [DataConverter.java#L399](https://github.com/confluentinc/kafka-connect-jdbc/blob/master/src/main/java/io/confluent/connect/jdbc/source/DataConverter.java#L399) there is a check for...
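For comparison, a sketch of the newer numeric.mapping setting alongside the boolean flag mentioned above; the connection URL is a placeholder and the comment on NUMBER(19,0) reflects the precision check linked above:

```properties
# Sketch of the numeric mapping options on the JDBC source; connection details
# are placeholders.
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:oracle:thin:@db-host:1521/ORCLPDB1
# numeric.precision.mapping=true is the older boolean form; numeric.mapping is
# the newer setting (none, precision_only, best_fit, best_fit_eager_double).
numeric.mapping=best_fit
# Even with best_fit, an integer column is only mapped to INT64 when its
# precision is guaranteed to fit a signed 64-bit long, so a column as wide as
# NUMBER(19,0) appears to fall back to the Decimal logical type (bytes).
```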
Hello all, I am new to Kafka and am exploring Debezium for Change Data Capture. I have used the Debezium source connector to read data from an Oracle database. In the Kafka topic,...
Hi, based on the **table.poll.interval.ms** parameter definition, the following query is constantly running from each Kafka JDBC connector. We configured our JDBC Kafka connector jobs with query mode, so we don't want...
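A sketch of a query-mode source config showing where table.poll.interval.ms sits next to poll.interval.ms; the query, column names, and intervals are illustrative, not taken from the report:

```properties
# Query-mode source sketch; the query, columns, and intervals are placeholders.
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:oracle:thin:@db-host:1521/ORCLPDB1
mode=timestamp
query=SELECT id, updated_at, payload FROM my_schema.my_table
timestamp.column.name=updated_at
# How often each task runs the query above looking for new data.
poll.interval.ms=5000
# How often the connector re-reads table metadata from the database; raising
# it is one way to reduce that background query when only query mode is used,
# since query mode does not depend on table discovery.
table.poll.interval.ms=3600000
```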