jalousiex
> I think this issue is caused by `flink-s3-fs-presto-1.19.1.jar`. The documentation recommends using it, so we did, and it failed. But when using `flink-s3-fs-hadoop-1.19.1.jar`, everything works. Thanks, it just worked.
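For anyone hitting the same error, swapping the S3 filesystem implementation is a plugin change, not a config change. A minimal sketch, assuming a standard Flink 1.19.1 distribution with `$FLINK_HOME` set (the jar ships in the distribution's `opt/` directory):

```shell
# Remove the Presto-based S3 filesystem plugin, if it was installed
rm -rf "$FLINK_HOME/plugins/s3-fs-presto"

# Install the Hadoop-based S3 filesystem plugin instead;
# Flink loads filesystem plugins from their own subdirectory under plugins/
mkdir -p "$FLINK_HOME/plugins/s3-fs-hadoop"
cp "$FLINK_HOME/opt/flink-s3-fs-hadoop-1.19.1.jar" "$FLINK_HOME/plugins/s3-fs-hadoop/"
```

Restart the cluster afterwards so the new plugin is picked up.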
It just kills the client; the job is still running on the cluster.
> mysql cdc uses the 3.2.0 package, then you should sink to paimon, and the error shows that paimon catalog does not support s3 for now

Actually, 0.8.2 does support it, and this job ran successfully: the logs show the jobGraph submission failed, yet the restApi submission succeeded. Regarding S3, Flink behaves quite differently depending on the packaging, and native support works best; submitting from Dinky fails with a similar error, but StreamPark eventually got the job running. Also, strangely, the dynamic parameters did not take effect. Below is the complete log of running the custom job after restarting StreamPark: ```...