flink-cdc
Flink CDC is a streaming data integration tool
Currently, when the sink is not an instance of `TwoPhaseCommittingSink`, `input.transform` is used rather than the stream, which means the pre-write topology will be ignored. ``` private void sinkTo( DataStream input, Sink sink, String...
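The branching described above can be pictured with stub types; this is a minimal sketch, not the real Flink code (the actual interfaces live in Flink's `sink2` API, and the stubs here are simplified stand-ins):

```java
// Stub interfaces standing in for Flink's sink2 types.
interface Sink {}
interface TwoPhaseCommittingSink extends Sink {}

public class SinkDispatch {
    // Sketch of the dispatch: only the committing branch would honor
    // a sink-defined pre-write topology; the plain branch bypasses it.
    static String sinkTo(Sink sink) {
        if (sink instanceof TwoPhaseCommittingSink) {
            return "committing-path";       // pre-write topology applied
        }
        return "plain-transform-path";      // pre-write topology ignored
    }

    public static void main(String[] args) {
        System.out.println(sinkTo(new TwoPhaseCommittingSink() {})); // committing-path
        System.out.println(sinkTo(new Sink() {}));                   // plain-transform-path
    }
}
```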
[FLINK-35277](https://issues.apache.org/jira/browse/FLINK-35277) In asncdcaddremove.sql, the original insert statement for ASNCDC.IBMSNAP_PRUNCNTL is as follows: ```sql -- Original insert statement SET stmtSQL = 'INSERT INTO ASNCDC.IBMSNAP_PRUNCNTL ( ' || 'TARGET_SERVER, ' || 'TARGET_OWNER, '...
[FLINK-35272][cdc][runtime] Pipeline Transform job supports omitting / renaming calculation column
This closes [FLINK-35272](https://issues.apache.org/jira/browse/FLINK-35272). Currently, pipeline jobs with transform (including projection and filtering) are constructed with the following topology: ``` SchemaTransformOp --> DataTransformOp --> SchemaOp ``` where schema projections are applied...
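The three-stage topology above can be pictured as a fixed chain of operators; a toy sketch using function composition (the operator names match the topology, but the stand-ins here are hypothetical):

```java
import java.util.function.UnaryOperator;

public class TransformTopology {
    // Toy stand-ins for the three pipeline operators described above.
    static UnaryOperator<String> schemaTransformOp = s -> s + " -> schemaTransform";
    static UnaryOperator<String> dataTransformOp   = s -> s + " -> dataTransform";
    static UnaryOperator<String> schemaOp          = s -> s + " -> schemaOp";

    // Events flow through the operators in this fixed order.
    static String run(String event) {
        return schemaOp.apply(dataTransformOp.apply(schemaTransformOp.apply(event)));
    }

    public static void main(String[] args) {
        System.out.println(run("event"));
        // event -> schemaTransform -> dataTransform -> schemaOp
    }
}
```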
[FLINK-35274](https://issues.apache.org/jira/projects/FLINK/issues/FLINK-35274) Fix occasional failure issue with Flink CDC Db2 UT
Cherry-picked from #3310, #3314.
Currently, `MySqlE2eITCase`'s result verification strategy is as follows: first, it keeps waiting until the last expected record is received, then it fetches all outputs at once and checks whether all expected records...
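The wait-for-last-then-compare strategy can be sketched as plain Java; this is a simplified illustration, not the test's actual code, and the method names here are hypothetical:

```java
import java.util.List;

public class ResultVerifier {
    // Sketch of the strategy described above: only once the last expected
    // record has appeared do we compare the full output against expectations.
    // In the real test, a missing last record would mean "keep waiting",
    // not an immediate failure.
    static boolean verify(List<String> expected, List<String> outputs) {
        String last = expected.get(expected.size() - 1);
        if (!outputs.contains(last)) {
            return false; // stand-in for "not done yet; retry later"
        }
        return outputs.containsAll(expected);
    }

    public static void main(String[] args) {
        List<String> expected = List.of("a", "b", "c");
        System.out.println(verify(expected, List.of("a", "c")));      // false: "b" never arrived
        System.out.println(verify(expected, List.of("a", "b", "c"))); // true
    }
}
```

The weakness of this strategy, which motivates the issue, is that records arriving after the last expected one, or out of order, are only caught by the final bulk comparison.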
This closes FLINK-35323. Currently, the Transform module doesn't support handling multiple tables with different schemas within one projection / filtering rule. This is not intended and turns out to be related...