[Feature][Flink] Data source SQL generation supports different type conversion rules for CDC and JDBC tasks
Search before asking
- [X] I had searched in the issues and found no similar feature requirement.
Description
The SQL generated from a data source should support different type conversion rules for CDC and JDBC tasks: a JDBC type mapping and a CDC type mapping.
Use case
SQL generated from a data source supports different type conversion rules for CDC and JDBC tasks.
Related issues
No response
Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
Code of Conduct
- [X] I agree to follow this project's Code of Conduct
Hello @xiaofan2022, this issue is about CDC/CDCSOURCE, so I assign it to @aiwenmo. If you have any questions, you can comment and reply.
Support SQL field type mapping for major connectors such as flink-cdc, flink-jdbc, flink-hive, and flink-hudi.
Please explain the purpose and design plan
Purpose: the SQL generation function should automatically generate synchronization SQL (with type mapping) based on the connector. Plan: first support field type mapping for SQL generated with the JDBC connector.
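As an illustration of the JDBC part of the plan, the type mapping could start as a simple lookup table. This is a minimal sketch; the class and the exact mappings are assumptions for illustration, not Dinky's actual API, and roughly follow the flink-connector-jdbc documentation for MySQL:

```java
import java.util.Map;

// Hypothetical sketch: a lookup table mapping MySQL column types to
// Flink SQL types for the JDBC connector. Names are illustrative only.
class JdbcTypeMapping {
    static final Map<String, String> MYSQL_TO_FLINK = Map.of(
            "TINYINT", "TINYINT",
            "INT", "INT",
            "BIGINT", "BIGINT",
            "FLOAT", "FLOAT",
            "DOUBLE", "DOUBLE",
            "DATETIME", "TIMESTAMP(3)",
            "DATE", "DATE",
            "VARCHAR", "STRING",
            "TEXT", "STRING",
            "BLOB", "BYTES");

    static String toFlinkType(String mysqlType) {
        // Fall back to STRING for types not covered by the table.
        return MYSQL_TO_FLINK.getOrDefault(mysqlType.toUpperCase(), "STRING");
    }

    public static void main(String[] args) {
        System.out.println(toFlinkType("datetime")); // prints TIMESTAMP(3)
    }
}
```

A real implementation would keep one such table per source database and per connector, rather than a single hard-coded map.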
Can you implement this function?
I can do the backend. Who can handle the frontend?
Thank you very much for your participation. Please clarify the front-end requirements, and we will be responsible for implementing them.
The front end adds different connector types (such as CDC, JDBC, Hive, Hudi) when generating SQL statements, and the back end maps the corresponding fields according to the official documentation based on the connector type.
Can you provide an example of the backend's mapping data structure? I will design the frontend.
I'd like to confirm whether it can be designed as follows:
- The front end adds a drop-down box in Generate SQL -> FlinkSQL tab (with values such as CDC, JDBC, Hive, Hudi).
- When a value is selected, the connector type is added to the interface's input parameters; the interface then generates the result based on the connector type and returns it.

What do you think?
yes
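The agreed design, with the connector type as an input parameter, could be sketched as a simple dispatch on the backend. This is an assumption-laden illustration (class, method, and mapping rules are invented for the sketch, and the hive/hudi branch is a placeholder), not Dinky's actual interface:

```java
// Hypothetical sketch: the frontend sends a connectorType parameter and
// the backend dispatches to the matching type-mapping rule when
// generating a column definition. All names here are illustrative.
class SqlColumnGenerator {
    static String generateColumn(String connectorType, String name, String sourceType) {
        String flinkType = switch (connectorType.toLowerCase()) {
            case "cdc", "jdbc" -> "DATETIME".equalsIgnoreCase(sourceType)
                    ? "TIMESTAMP(3)" : "STRING"; // simplified stand-in tables
            case "hive", "hudi" -> "STRING";     // placeholder rule
            default -> "STRING";                 // unknown connector fallback
        };
        return "`" + name + "` " + flinkType;
    }

    public static void main(String[] args) {
        System.out.println(generateColumn("jdbc", "create_time", "DATETIME"));
        // prints `create_time` TIMESTAMP(3)
    }
}
```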
If we want to add flink-cdc type mapping, should we add a new flink-cdc-meta module and put the corresponding logic into flink-cdc-meta?
@aiwenmo What do you think?
Put the corresponding logic into dinky-cdc.
It would be inappropriate to put everything in dinky-cdc. In theory, the column type information in the data source module also needs to use the type conversion here to achieve a unified effect.
Is it possible to maintain separate conversion logic for each source (e.g. flink-cdc, jdbc, hive, hudi) inside the conversion logic for the corresponding connector?
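Maintaining per-connector conversion logic behind a shared interface, so that both the data source module and SQL generation reuse it, could look like the following strategy-pattern sketch. All class names and mapping rules here are assumptions made for illustration, not Dinky's actual code:

```java
import java.util.Map;

// Hypothetical sketch: one converter per connector behind a shared
// interface, registered in a small lookup so callers pick by name.
interface TypeConverter {
    String convert(String sourceType);
}

class CdcTypeConverter implements TypeConverter {
    public String convert(String sourceType) {
        // Stand-in for the flink-cdc mapping table
        return "DATETIME".equalsIgnoreCase(sourceType) ? "TIMESTAMP(3)" : "STRING";
    }
}

class JdbcTypeConverter implements TypeConverter {
    public String convert(String sourceType) {
        // Stand-in for the flink-jdbc mapping table
        return "TEXT".equalsIgnoreCase(sourceType) ? "STRING" : sourceType.toUpperCase();
    }
}

class ConverterRegistry {
    static final Map<String, TypeConverter> CONVERTERS = Map.of(
            "cdc", new CdcTypeConverter(),
            "jdbc", new JdbcTypeConverter());

    static TypeConverter forConnector(String connectorType) {
        // Unknown connectors fall back to a STRING-everything converter.
        return CONVERTERS.getOrDefault(connectorType.toLowerCase(), t -> "STRING");
    }

    public static void main(String[] args) {
        System.out.println(forConnector("cdc").convert("DATETIME")); // prints TIMESTAMP(3)
    }
}
```

A design like this would let a hive or hudi converter be added later without touching existing connectors.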
Hello @, this issue has not been active for more than 30 days. It will be closed in 7 days if there is no response. If you have any questions, you can comment and reply.