dinky
[Bug] [CDCSOURCE] CDCSOURCE sync from MySQL to Doris fails with ERROR: java.lang.NoClassDefFoundError: Lorg/apache/kafka/connect/json/JsonConverter
Search before asking
- [X] I had searched in the issues and found no similar issues.
What happened
Following the official documentation, I placed dlink-client-base-${version}.jar, dlink-common-${version}.jar, and dlink-client-${version}.jar under flink/lib. I am using Flink 1.13.6 and Doris 1.2.1. As soon as I run a task in Dinky it reports ERROR: java.lang.NoClassDefFoundError: Lorg/apache/kafka/connect/json/JsonConverter. It looks like a jar is missing, but I cannot tell which one.
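One way to narrow down the "which jar is missing" question is to scan the jars on the classpath for the class the JVM cannot find. Below is a small hypothetical helper (the function name `find_converter_jars` is made up for illustration; it assumes `unzip` is installed):

```shell
# find_converter_jars DIR
# Prints every jar under DIR that bundles org.apache.kafka.connect.json.JsonConverter.
find_converter_jars() {
  for j in "$1"/*.jar; do
    [ -e "$j" ] || continue   # skip when the glob matched no files
    if unzip -l "$j" 2>/dev/null \
        | grep -q 'org/apache/kafka/connect/json/JsonConverter.class'; then
      echo "$j"
    fi
  done
}

# Example usage (adjust the path to your installation):
# find_converter_jars /opt/flink/lib
```

If the function prints nothing for flink/lib, no jar there provides the class, which is consistent with the NoClassDefFoundError above.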
What you expected to happen
Whole-database synchronization from MySQL to Doris.
How to reproduce
EXECUTE CDCSOURCE all_test WITH (
  'connector' = 'mysql-cdc',
  'hostname' = 'xxxx',
  'port' = '3306',
  'username' = 'xxxx',
  'password' = 'xxxx',
  'checkpoint' = '60000',
  'scan.startup.mode' = 'initial',
  'parallelism' = '1',
  'table-name' = 'hhb_report.shipments,hhb_report.t_shipments_mysql',
  'sink.connector' = 'doris',
  'sink.fenodes' = 'xxxx:8080',
  'sink.username' = 'xxxx',
  'sink.password' = 'xxxx',
  'sink.sink.max-retries' = '1',
  'sink.sink.batch.interval' = '5',
  'sink.sink.db' = 'test',
  'sink.table.prefix' = 'dwd_',
  'sink.table.upper' = 'true',
  'sink.table.identifier' = '${schemaName}.${tableName}',
  'sink.sink.enable-delete' = 'true'
);
Anything else
No response
Version
0.7.0
Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
Code of Conduct
- [X] I agree to follow this project's Code of Conduct
This may be a dependency conflict; take a look under plugins.
I tried the same operation with Flink 1.16.0 and there was no problem.
Check the dependencies under plugins.
Dinky version 0.7.
The Doris and CDC dependencies may conflict. Try downgrading the CDC dependency version.
Dinky version 0.7.
Did you solve this? I have the same problem. I tried every CDC version and it still fails: single-table sync works, but whole-database sync does not. If you found a solution, could you share it? Thanks!
I did not go back and test 1.13; testing directly on 1.16 with the same operation works fine.
In Dinky, the whole-database sync for Flink 1.13 uses org.apache.kafka.connect.json.JsonConverter in MysqlJsonDebeziumDeserializationSchema, so the fat jar cannot be used; you need kafka-client.jar and similar dependencies.
In Dinky, the whole-database sync for Flink 1.16 uses the shaded
com.ververica.cdc.connectors.shaded.org.apache.kafka.connect.json.JsonConverter
in MysqlJsonDebeziumDeserializationSchema, so the fat jar works and only flink-sql-connector-mysql-cdc-*.jar is needed.
Solution: the recommended fix is to change Dinky's 1.13 code to use the shaded dependency.
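The class-name difference described above can be checked with a minimal probe (a hypothetical sketch; the class and method names here are made up for illustration, while the two fully-qualified converter names come from the comment above). On a classpath that has neither kafka-client.jar nor the fat CDC jar, both lookups fail, which matches the reported NoClassDefFoundError:

```java
// Probe whether the unshaded and shaded JsonConverter classes are loadable
// on the current classpath.
public class JsonConverterProbe {

    // Returns true if the fully-qualified class name can be loaded.
    static boolean isLoadable(String fqcn) {
        try {
            Class.forName(fqcn);
            return true;
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Used by Dinky's Flink 1.13 code path; requires kafka-client.jar etc.
        String unshaded = "org.apache.kafka.connect.json.JsonConverter";
        // Used by Dinky's Flink 1.16 code path; bundled inside
        // flink-sql-connector-mysql-cdc-*.jar (the fat jar).
        String shaded = "com.ververica.cdc.connectors.shaded."
                + "org.apache.kafka.connect.json.JsonConverter";

        System.out.println("unshaded loadable: " + isLoadable(unshaded));
        System.out.println("shaded loadable:   " + isLoadable(shaded));
    }
}
```

Running this inside the Flink task classloader would show which variant is actually available, and therefore which code path (1.13 vs 1.16) can succeed.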
- [X] #1778