flink-cdc

Flink CDC is a streaming data integration tool

606 flink-cdc issues (sorted by recently updated)

[debezium] Bump Debezium version to 1.9.2.

**Describe the bug (Please use English)** `ORA-00600: internal error code, arguments: [krvrdccs10], [], [], [], [], [], [], [], [], [], [], []` **Environment:** - Flink version :...

bug

postgres-cdc.md DataStream Source: a code example should be added so users can understand and use it more easily. **Environment:** - Flink version : 1.13.3 - Flink...

bug
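The docs issue above asks for a runnable example for the postgres-cdc connector. For reference, a minimal Flink SQL source definition might look like the sketch below. All connection values and the table schema are placeholders, and this is the SQL form rather than the DataStream example the issue requests; treat it as an illustration only.

```sql
-- Hypothetical postgres-cdc source table; every connection value is a placeholder.
CREATE TABLE shipments (
    shipment_id INT,
    order_id    INT,
    origin      STRING,
    PRIMARY KEY (shipment_id) NOT ENFORCED
) WITH (
    'connector'     = 'postgres-cdc',
    'hostname'      = 'localhost',
    'port'          = '5432',
    'username'      = 'postgres',
    'password'      = 'postgres',
    'database-name' = 'postgres',
    'schema-name'   = 'public',
    'table-name'    = 'shipments'
);
```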

The XDBMASK field is of type TIMESTAMP in SQL Server (its data is stored as binary), but of type BIGINT in MySQL. When data is...

bug

**Describe the bug (Please use English)** A synchronization error occurs when the table contains millions of rows; the same table with a data volume of...

bug

**Describe the bug (Please use English)** The binlog can contain mixed replication formats when a cloud database such as CDB performs an HA transition and writes statement-format replication events to the binlog. In Debezium 1.5.4.Final, it...

bug

**Is your feature request related to a problem? Please describe.** For an order Elasticsearch model (orderitem uses a nested object), how can orderitems be constructed as an `ARRAY>` and written to Elasticsearch? **Describe the solution you'd like** Achieve this through simple configuration, without defining a UDF in Java code.

enhancement

**Describe the bug (Please use English)** When running CDC on a single table, the internal code fetches two extra records while reading the column metadata, causing a NullPointerException, and the printed output is missing two rows. **Environment:** - Flink version : 1.14.5 - Flink CDC version: 2.3-snapshot - debezium-connector-postgres-1.6.4.Final.jar - Database and version: PostgreSQL 12.4 **To Reproduce** There is a table `test` with **55 columns** and more than 20,000 rows; whether via Flink SQL or streaming...

bug

**Describe the bug (Please use English)** I need to create a real-time mirror of the database, so I monitor all of its tables (500 tables). But in the full...

bug

This closes https://github.com/ververica/flink-cdc-connectors/issues/2691. * Support the value formats `debezium-json` and `canal-json`. * The Kafka topic written to will be the `namespace.schemaName.tableName` string of the TableId; this can be changed using the `route` function of...

docs
common
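The `route` mechanism mentioned in the PR above remaps a captured source TableId to a different sink name, overriding the default `namespace.schemaName.tableName` Kafka topic. In pipeline-style YAML (syntax as documented for Flink CDC pipeline jobs; the table names below are placeholders, so read this as an illustrative sketch rather than the PR's exact configuration), a route block might look like:

```yaml
route:
  - source-table: mydb.orders   # TableId captured from the source database
    sink-table: ods_orders      # name used on the sink side instead of the default
```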