flink-connector-kafka
Apache Flink
Modified the documentation on using the right dependency for 'properties.sasl.jaas.config'. When using 'flink-sql-connector-kafka.jar', the existing documentation does not use the shaded class name. Also added the name of the jar file (flink-sql-connector-kafka-x.xx.x.jar).
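As a sketch of the documented fix: the fat `flink-sql-connector-kafka` jar relocates the Kafka client classes under Flink's shaded package, so the JAAS login module must be referenced by its shaded name. Table name, topic, broker address, and credentials below are placeholders:

```sql
CREATE TABLE kafka_source (
  msg STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'my-topic',
  'properties.bootstrap.servers' = 'broker:9092',
  'format' = 'json',
  'properties.security.protocol' = 'SASL_PLAINTEXT',
  'properties.sasl.mechanism' = 'PLAIN',
  -- shaded class name, required when flink-sql-connector-kafka-x.xx.x.jar is on the classpath
  'properties.sasl.jaas.config' = 'org.apache.flink.kafka.shaded.org.apache.kafka.common.security.plain.PlainLoginModule required username="alice" password="secret";'
);
```

With the plain `flink-connector-kafka` dependency (classes not relocated), the unshaded `org.apache.kafka.common.security.plain.PlainLoginModule` name is used instead.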
[FLINK-32743][Connectors/Kafka] Parse data from kafka connect and convert it into regular JSON data
Adds the "record.key.include.kafka.connect.json.schema" and "record.value.include.kafka.connect.json.schema" configuration options. When an option is set to true, the corresponding records are converted into regular JSON data.
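For context: the Kafka Connect JSON converter (with schemas enabled) wraps each record in a `schema`/`payload` envelope, e.g. `{"schema": {...}, "payload": {"id": 1, "name": "a"}}`. A sketch of how the new options from this PR might appear in a table definition, assuming they are set in the WITH clause like other connector options (table, topic, and broker names are placeholders):

```sql
CREATE TABLE connect_source (
  id BIGINT,
  name STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'connect-topic',
  'properties.bootstrap.servers' = 'broker:9092',
  'format' = 'json',
  -- declare that key and value carry the Kafka Connect schema envelope,
  -- so the connector can unwrap them into regular JSON payloads
  'record.key.include.kafka.connect.json.schema' = 'true',
  'record.value.include.kafka.connect.json.schema' = 'true'
);
```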
# What is the purpose of the change
In the CachingTopicSelector, a memory leak may occur when the internal logic fails to check the cache size due to a race...
1. Add `scan.bounded.mode`, `scan.startup.specific-offsets`, `scan.bounded.specific-offsets` to the forward options.
2. Remove `sink.parallelism` from the forward options.
3. Fix the `scan.bounded.mode` docs.
4. Improve the Chinese docs.
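A sketch of how the startup and bounded options named above are used together, based on the documented Kafka SQL connector options (table, topic, broker, and offset values are illustrative):

```sql
CREATE TABLE bounded_kafka (
  msg STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'my-topic',
  'properties.bootstrap.servers' = 'broker:9092',
  'format' = 'json',
  -- begin reading at explicit per-partition offsets
  'scan.startup.mode' = 'specific-offsets',
  'scan.startup.specific-offsets' = 'partition:0,offset:10',
  -- stop reading once the given offsets are reached, making the source bounded
  'scan.bounded.mode' = 'specific-offsets',
  'scan.bounded.specific-offsets' = 'partition:0,offset:42'
);
```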
## What is the purpose of the change
[FLINK-32893](https://issues.apache.org/jira/browse/FLINK-32893) There are some use cases for using the same clientID across all of the low-level Kafka consumers to 1. Simplify how...
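For reference, Kafka client properties are forwarded through the connector's `properties.*` prefix, so a shared client ID can be configured like this (a sketch; whether every internal low-level consumer honors the single configured ID is exactly what this PR concerns, and the names below are placeholders):

```sql
CREATE TABLE shared_client_source (
  msg STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'my-topic',
  'properties.bootstrap.servers' = 'broker:9092',
  'format' = 'json',
  -- forwarded to the underlying KafkaConsumer as client.id
  'properties.client.id' = 'my-shared-client'
);
```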
## What is the purpose of the change
- [JUnit5 Migration] Module: flink-connector-kafka.

## Brief change log
- *Updated simple JUnit 4 test packages to JUnit 5 test packages*...
- Removing unnecessary escape characters '\' in the documented ` 'properties.sasl.jaas.config' = 'org.apache.kafka.common.security.scram.ScramLoginModule required username=\"username\" password=\"password\";'` - using the escape characters can cause "Value not specified for key 'username' in JAAS config"...
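With the backslashes removed, the property from the snippet above would read as follows; in Flink SQL the value is a single-quoted string, so the inner double quotes need no escaping (credentials are placeholders):

```sql
'properties.sasl.jaas.config' = 'org.apache.kafka.common.security.scram.ScramLoginModule required username="username" password="password";'
```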
## What is the purpose of the change
As discussed in the thread https://lists.apache.org/thread/l98pc18onxrcrsb01x5kh1vppl7ymk2d, connectors shouldn't rely on dependencies that may or may not be available in Flink itself. But currently...
This PR drops support for Flink 1.17 & 1.18 and fixes tests for 1.20-SNAPSHOT.
This PR came out of debugging a warning we're seeing in our Flink logs. We're running Flink 1.18 and have an application that uses Kafka topics as a source and...