kafka-connect-jdbc

Kafka Connect connector for JDBC-compatible databases

Results: 208 kafka-connect-jdbc issues

## Problem We want to use this JDBC connector not to produce messages that expose our database schema, but to produce clean JSON messages that have been prepared in a database table...
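
The setup described there reads like an outbox-style pattern: query a staging table that already holds the serialized payload and pass it through untouched. A minimal sketch of such a source config, assuming a hypothetical `outbox` table with an `id` and a pre-built `payload` column (the table, column, topic, and connection URL are placeholders, not taken from the report):

```properties
# Sketch only: the table, column, topic, and URL below are hypothetical placeholders.
name=outbox-json-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
# Placeholder connection URL
connection.url=jdbc:postgresql://db:5432/app
# Read only the staging table via a custom query
mode=incrementing
incrementing.column.name=id
query=SELECT id, payload FROM outbox
# With a custom query, topic.prefix is used as the topic name
topic.prefix=outbox-events
# Strip the record down to the pre-serialized payload and emit it as a plain string,
# so the connector does not wrap it in its own schema
transforms=extractPayload
transforms.extractPayload.type=org.apache.kafka.connect.transforms.ExtractField$Value
transforms.extractPayload.field=payload
value.converter=org.apache.kafka.connect.storage.StringConverter
```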

## Problem Postgres partitioned tables are not supported [after this change in the driver](https://github.com/pgjdbc/pgjdbc/commit/25eb32c8681eaa4aaac801808b6028e9f5dfbea8#diff-0571f8ac3385a7f7bb34e5c77f8afd24810311506989379c2e85c6c16eea6ce4L1287). #1092 ## Solution Map `PARTITIONED TABLE` as a valid value; not yet enabled by default, although...
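
If that mapping lands as an opt-in, enabling it would presumably come down to listing the new value in `table.types`; a sketch under that assumption (the connection URL and topic prefix are placeholders):

```properties
# Sketch, assuming PARTITIONED TABLE becomes an accepted table.types value;
# the default would remain TABLE, so existing deployments are unaffected.
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
# Placeholder connection URL
connection.url=jdbc:postgresql://db:5432/app
table.types=TABLE,PARTITIONED TABLE
mode=bulk
topic.prefix=pg-
```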

Scenario: Kafka Connect with the JDBC source connector talking to SQL Server is up and running and polling a database table. SQL Server is then taken down. The Kafka Connect log file shows: [2019-12-12 09:58:14,124]...

bug
triaged

Need info for the use cases below. Use case 1: 1 source connector, `tasks.max=2`, `table.whitelist=table1`. For this configuration, how many JDBC connections will get created? Use case 2: 1...

## Problem Some configurations and many variables throughout the code base use racially charged terms. ## Solution - Introduce new configurations. Existing configurations using racially charged terms are marked as...

Hi, I need to set up Kafka Connect in timestamp-incrementing mode against Firebird. However, I got an error: DEBUG TimestampIncrementingTableQuerier{table="CATEGORIES", query='null', topicPrefix='FB_', incrementingColumn='', timestampColumns=[LAST_MODIFIED]} prepared SQL query: SELECT * FROM "CATEGORIES"...
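
For context, the DEBUG line above implies a plain timestamp-mode querier (no incrementing column) on CATEGORIES.LAST_MODIFIED with topic prefix FB_. A config sketch matching that, with the Firebird connection URL as a placeholder:

```properties
# Sketch reconstructed from the DEBUG line; the connection URL is a placeholder.
name=firebird-categories-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
# Placeholder Firebird (Jaybird) URL
connection.url=jdbc:firebirdsql://localhost:3050/employee
table.whitelist=CATEGORIES
mode=timestamp
timestamp.column.name=LAST_MODIFIED
topic.prefix=FB_
```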

enhancement
help wanted
dialect
triaged

Using timestamp mode with an Informix database doesn't work. Source configuration: ```json name: my_connector config: 'connector.class': io.confluent.connect.jdbc.JdbcSourceConnector 'tasks.max': 1 'key.converter': org.apache.kafka.connect.json.JsonConverter 'value.converter': org.apache.kafka.connect.json.JsonConverter 'connection.url': '...' 'mode': timestamp 'timestamp.column.name': create_stamp 'topic.prefix':...
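
The preview truncates the configuration; written out as a standalone properties file it would look roughly like this, with the elided connection URL and topic prefix replaced by placeholders:

```properties
# Rough reconstruction of the truncated config above; connection.url and
# topic.prefix are placeholders, not the reporter's actual values.
name=my_connector
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Placeholder Informix URL
connection.url=jdbc:informix-sqli://localhost:9088/testdb:INFORMIXSERVER=informix
mode=timestamp
timestamp.column.name=create_stamp
# Placeholder prefix (elided in the preview)
topic.prefix=informix-
```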

Hi, I have installed the Confluent local version 7.1.1. After installing kafka-connect-jdbc, I am getting this error: `An unexpected error occurred: org.apache.kafka.common.config.ConfigException: Missing required configuration "connection.url" which has no default...
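
The exception itself is the key: `connection.url` has no default, so every JDBC connector instance must set it explicitly. A minimal config that satisfies the requirement (the URL is a placeholder):

```properties
# Minimal sketch; connection.url is mandatory and the value below is a placeholder.
name=jdbc-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:mysql://localhost:3306/testdb
mode=bulk
topic.prefix=jdbc-
```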

## Problem The current implementation of kafka-connect-jdbc does not properly support Redshift. Adding multiple columns does not work. Redshift converts the TEXT type to VARCHAR(256) by default. For many data,...

## Problem ## Solution ##### Does this solution apply anywhere else? - [ ] yes - [ ] no ##### If yes, where? ## Test Strategy ##### Testing done: -...