[FLINK-36517][cdc-connect][paimon] Use filterAndCommit API to avoid committing the same datafile twice
The problem reported in https://issues.apache.org/jira/browse/FLINK-35938 still persists:
the StoreMultiCommitter.commit API may commit the same datafile twice when the job restarts after a failure.
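For illustration, here is a minimal sketch of the filter-before-commit idea this PR relies on. It is not the actual Paimon API; the class and method names below (`FilterAndCommitSketch`, `committedIdentifiers`) are hypothetical. The point is that commits are keyed by checkpoint identifier, and on restart any identifier that was already committed is filtered out instead of being re-applied:

```java
import java.util.*;

// Hypothetical sketch (not Paimon's real implementation): skip any
// checkpoint whose identifier was already committed, so that replaying
// pending commits after a restart cannot commit the same datafiles twice.
public class FilterAndCommitSketch {
    // Identifiers already recorded in the table's commit history.
    private final Set<Long> committedIdentifiers = new HashSet<>();
    // Order in which checkpoints were actually committed.
    private final List<Long> committedLog = new ArrayList<>();

    // Commits pending datafiles per checkpoint, skipping identifiers that
    // were committed before the failure. Returns how many checkpoints
    // were actually committed in this call.
    public int filterAndCommit(SortedMap<Long, List<String>> pendingByCheckpoint) {
        int committed = 0;
        for (Map.Entry<Long, List<String>> entry : pendingByCheckpoint.entrySet()) {
            long checkpointId = entry.getKey();
            if (committedIdentifiers.contains(checkpointId)) {
                continue; // already committed; replaying it would duplicate datafiles
            }
            // "Commit" the datafiles belonging to this checkpoint.
            committedIdentifiers.add(checkpointId);
            committedLog.add(checkpointId);
            committed++;
        }
        return committed;
    }

    public List<Long> committedLog() {
        return committedLog;
    }
}
```

With a plain commit, replaying checkpoint 2 after a restart would write its datafiles again; with the filter step, the replay is a no-op for already-committed identifiers.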
@lvyanquan @leonardBang PTAL
Could you please assist in reviewing this PR? Thank you. @lvyanquan
I agree that the issue of duplicate commits still exists. Our test coverage for abnormal failover is relatively lacking; could you try adding a corresponding test case for this?
I will try, thanks.
The test fails because the checkpointId is always 1; I'll fix it. https://github.com/apache/flink-cdc/pull/3652/files
@beryllw Would you like to backport this fix to release-3.2 branch?
OK.