flink-cdc
Flink CDC is a streaming data integration tool
See [FLINK-35244](https://issues.apache.org/jira/browse/FLINK-35244): the test cases for flink-connector-tidb-cdc should live under the `org.apache.flink.cdc.connectors.tidb` package instead of `org.apache.flink.cdc.connectors`.
Resolves [FLINK-35119](https://issues.apache.org/jira/browse/FLINK-35119): the `before` and `after` fields of a `DataChangeEvent` now participate in serialization even when their value is null.
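To illustrate the null-handling idea, here is a minimal sketch of a null-safe serialization scheme: a presence flag is written before each optional record image, so an event whose `before` or `after` image is null still round-trips. The class `NullSafeImageSerde` and its use of a plain `String` payload are hypothetical simplifications, not the actual Flink CDC serializer.

```java
import java.io.*;

// Hypothetical sketch: null-safe serialization of an optional record image
// via a 1-byte presence flag followed by the payload.
public class NullSafeImageSerde {

    public static byte[] serialize(String image) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(bos);
            out.writeBoolean(image != null);   // presence flag first
            if (image != null) {
                out.writeUTF(image);           // payload only when present
            }
            return bos.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static String deserialize(byte[] bytes) {
        try {
            DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes));
            return in.readBoolean() ? in.readUTF() : null;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        // An INSERT event has no `before` image; it must survive a round trip.
        System.out.println(deserialize(serialize(null)));     // prints "null"
        System.out.println(deserialize(serialize("row-v2"))); // prints "row-v2"
    }
}
```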
### Purpose

Linked issue: close #2470

This is part 1, fixing the Oracle module. If everything is okay, I'll proceed to submit PRs for the other modules, such as mysql, pg,...
According to the [Doris documentation](https://doris.apache.org/docs/sql-manual/sql-reference/Data-Definition-Statements/Alter/ALTER-TABLE-COLUMN/), altering column types dynamically is supported (via the `ALTER TABLE ... MODIFY COLUMN` statement) when a lossless conversion is available. However, the Doris pipeline connector does not yet support this...
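As a rough illustration of what such support could emit, here is a sketch that builds the `ALTER TABLE ... MODIFY COLUMN` DDL for a lossless type widening. The class name, the `buildModifyColumnDdl` helper, and the small widening table are assumptions for illustration, not the connector's actual API; consult the Doris docs for the full set of allowed conversions.

```java
import java.util.Map;

// Hypothetical sketch: generate Doris ALTER TABLE DDL for a lossless
// column type change; rejects conversions not in the whitelist.
public class DorisAlterColumnSketch {

    // Subset of lossless widenings (illustrative, not exhaustive).
    private static final Map<String, String> LOSSLESS = Map.of(
            "SMALLINT", "INT",
            "INT", "BIGINT",
            "FLOAT", "DOUBLE");

    public static String buildModifyColumnDdl(String table, String column,
                                              String fromType, String toType) {
        if (!toType.equals(LOSSLESS.get(fromType))) {
            throw new IllegalArgumentException(
                    "Unsupported or lossy conversion: " + fromType + " -> " + toType);
        }
        return "ALTER TABLE " + table + " MODIFY COLUMN " + column + " " + toType;
    }

    public static void main(String[] args) {
        System.out.println(buildModifyColumnDdl("orders", "amount", "INT", "BIGINT"));
        // prints: ALTER TABLE orders MODIFY COLUMN amount BIGINT
    }
}
```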
Add metrics currentFetchEventTimeLag, currentEmitEventTimeLag, and sourceIdleTime for TiKVRichParallelSourceFunction. Issue: https://github.com/ververica/flink-cdc-connectors/issues/985
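A minimal sketch of how these three values could be computed, assuming the usual definitions (fetch/emit lag = processing time minus event time; idle time = time since the last processed record). In Flink they would be registered as `Gauge`s on the source's `MetricGroup`; that wiring is omitted here, and the class and method names are hypothetical.

```java
// Hypothetical sketch of the three source metrics as plain accessors.
public class TikvSourceMetricsSketch {
    private volatile long fetchDelay = 0L;       // backs currentFetchEventTimeLag
    private volatile long emitDelay = 0L;        // backs currentEmitEventTimeLag
    private volatile long lastProcessTime = 0L;  // backs sourceIdleTime

    // Called when a record is fetched from TiKV.
    public void reportFetched(long eventTimestamp, long fetchTimestamp) {
        fetchDelay = fetchTimestamp - eventTimestamp;
        lastProcessTime = fetchTimestamp;
    }

    // Called when the record is emitted downstream.
    public void reportEmitted(long eventTimestamp, long emitTimestamp) {
        emitDelay = emitTimestamp - eventTimestamp;
    }

    public long currentFetchEventTimeLag() { return fetchDelay; }

    public long currentEmitEventTimeLag() { return emitDelay; }

    // Timestamps passed in explicitly to keep the sketch deterministic.
    public long sourceIdleTime(long now) { return now - lastProcessTime; }

    public static void main(String[] args) {
        TikvSourceMetricsSketch m = new TikvSourceMetricsSketch();
        m.reportFetched(1_000L, 1_250L);   // fetched 250 ms after the event occurred
        m.reportEmitted(1_000L, 1_300L);   // emitted 300 ms after the event occurred
        System.out.println(m.currentFetchEventTimeLag()); // prints 250
        System.out.println(m.currentEmitEventTimeLag());  // prints 300
        System.out.println(m.sourceIdleTime(2_250L));     // prints 1000
    }
}
```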
When using splitOneUnevenlySizedChunk, we sometimes end up with one very large chunk (for example, when a database, table, or column uses the 'utf8mb4_general_ci' collation). For example, when the primary-key values look like ['0000','1111','2222','3333','4444','aaaa','bbbb','cccc','ZZZZ',...]...
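The skew can be reproduced with a small sketch: a chunk boundary chosen under binary string order lands in a different place under a case-insensitive collation like utf8mb4_general_ci (where 'ZZZZ' sorts after the lowercase keys), so most rows pile into one chunk. The class and helper below are illustrative, not the splitter's real code.

```java
import java.util.*;

// Illustrative sketch: the same boundary splits the key space very
// differently under binary order vs. a case-insensitive collation.
public class UnevenChunkSketch {

    // Assign each key to the first chunk whose upper boundary exceeds it,
    // under the given ordering; returns rows-per-chunk counts.
    public static int[] countPerChunk(List<String> keys, List<String> bounds,
                                      Comparator<String> order) {
        int[] counts = new int[bounds.size() + 1];
        for (String k : keys) {
            int chunk = 0;
            while (chunk < bounds.size() && order.compare(k, bounds.get(chunk)) >= 0) {
                chunk++;
            }
            counts[chunk]++;
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> keys = List.of("0000", "1111", "2222", "3333", "4444",
                                    "aaaa", "bbbb", "cccc", "ZZZZ");
        // Boundary picked assuming binary order, where 'Z' (0x5A) < 'a' (0x61).
        List<String> bounds = List.of("ZZZZ");

        // Binary order: digits before the boundary, lowercase after -> [5, 4]
        System.out.println(Arrays.toString(
                countPerChunk(keys, bounds, Comparator.naturalOrder())));

        // Case-insensitive (utf8mb4_general_ci-like): 'ZZZZ' sorts last,
        // so almost everything lands in the first chunk -> [8, 1]
        System.out.println(Arrays.toString(
                countPerChunk(keys, bounds, String.CASE_INSENSITIVE_ORDER)));
    }
}
```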
This PR implements [#1898](https://github.com/ververica/flink-cdc-connectors/issues/1898) and exposes the corresponding configuration support only for the MySQL module.