flink-cdc
Flink CDC is a streaming data integration tool
Due to the issue mentioned in https://github.com/ververica/flink-cdc-connectors/issues/1406, Flink CDC Connectors may handle Timestamp fields differently during the snapshot and streaming phases. Here I propose changing the default value of...
If your table has no primary key, you will not see any data in Flink (neither historical nor new), and no error message is reported.
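One workaround for tables without a primary key is to disable the incremental snapshot, which needs a primary key for chunk splitting, and fall back to the legacy snapshot implementation. A minimal sketch; the table, host, and credentials below are illustrative placeholders:

```sql
-- Hypothetical mysql-cdc source for a table that has no primary key.
-- Disabling the incremental snapshot avoids the silent no-data behavior.
CREATE TABLE orders (
  order_id INT,
  amount DECIMAL(10, 2)
) WITH (
  'connector' = 'mysql-cdc',
  'hostname' = 'localhost',
  'port' = '3306',
  'username' = 'flinkuser',
  'password' = '***',
  'database-name' = 'mydb',
  'table-name' = 'orders',
  -- Fall back to the legacy (non-chunked) snapshot reader
  'scan.incremental.snapshot.enabled' = 'false'
);
```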
When using Flink 1.14 with Flink CDC 2.2.1, the connector source code depends on Flink 1.13.5, so it reports that the dependency org.apache.flink:flink-shaded-guava:18.0-13.0 is missing if you refer...
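A common workaround for this version mismatch is to add the missing shaded-guava artifact explicitly to the job's pom.xml. A sketch, assuming the coordinates from the error message; the correct version may differ for other Flink releases:

```xml
<!-- Explicitly provide the shaded Guava artifact that flink-cdc 2.2.1
     (built against Flink 1.13.5) expects on the classpath. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-shaded-guava</artifactId>
  <version>18.0-13.0</version>
</dependency>
```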
Add two frequently used parameters and precautions
[oracle] The **dbname** should generally be the SERVICE_NAME, not the INSTANCE_NAME, from the _lsnrctl status_ output; in particular, when connecting to a PDB inside a CDB it must be the SERVICE_NAME. Of course we should...
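For illustration, a hedged oracle-cdc table definition where 'database-name' is set to the service name reported by _lsnrctl status_ (e.g. the PDB service); all hostnames, credentials, and object names below are placeholder assumptions:

```sql
CREATE TABLE products (
  ID INT,
  NAME STRING,
  PRIMARY KEY (ID) NOT ENFORCED
) WITH (
  'connector' = 'oracle-cdc',
  'hostname' = 'localhost',
  'port' = '1521',
  'username' = 'flinkuser',
  'password' = '***',
  -- Use the SERVICE_NAME (e.g. the PDB service), not the instance name (SID)
  'database-name' = 'ORCLPDB1',
  'schema-name' = 'INVENTORY',
  'table-name' = 'PRODUCTS'
);
```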
The flink-json dependency used internally by changelog-json creates the JsonRowDataSerializationSchema instance with the default value of encode.decimal-as-plain-number. This change exposes that option as a configurable changelog-json option, fixing the problem that Decimal-typed fields are serialized in scientific notation.
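With the option exposed, a sink definition could enable plain-number encoding for decimals. A sketch only: the option key below mirrors flink-json's naming and is an assumption based on this change, and all connection details are placeholders:

```sql
CREATE TABLE kafka_sink (
  id INT,
  price DECIMAL(38, 18)
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'changelog-json',
  -- Serialize DECIMAL values as plain numbers instead of scientific notation
  'changelog-json.encode.decimal-as-plain-number' = 'true'
);
```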
Add postgres-cdc debezium.plugin.name to the source table definition; without debezium.plugin.name, an exception is raised.
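A sketch of a postgres-cdc source definition carrying that pass-through option; the 'pgoutput' value and all connection details are illustrative assumptions:

```sql
CREATE TABLE shipments (
  shipment_id INT,
  order_id INT,
  PRIMARY KEY (shipment_id) NOT ENFORCED
) WITH (
  'connector' = 'postgres-cdc',
  'hostname' = 'localhost',
  'port' = '5432',
  'username' = 'flinkuser',
  'password' = '***',
  'database-name' = 'postgres',
  'schema-name' = 'public',
  'table-name' = 'shipments',
  -- Pass the logical decoding plugin name through to Debezium;
  -- without it the source raises an exception
  'debezium.plugin.name' = 'pgoutput'
);
```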
EOFException will happen when upgrading from mysql-cdc 2.2.0 and restoring from a savepoint. The exception stack is as follows: `java.io.EOFException at org.apache.flink.core.memory.DataInputDeserializer.readBoolean(DataInputDeserializer.java:125) at com.ververica.cdc.connectors.mysql.source.split.MySqlSplitSerializer.deserializeSplit(MySqlSplitSerializer.java:165) at com.ververica.cdc.connectors.mysql.source.split.MySqlSplitSerializer.deserialize(MySqlSplitSerializer.java:124)` The reason is that...
The previous logic did a first conversion with the OceanBaseJdbcConverter helper methods and then used the runtime converters in JdbcValueConverters for a second conversion. Now we only need to do...