Compatibility problem with Flink 1.14.*
java.lang.NoClassDefFoundError: org/apache/flink/shaded/guava18/com/google/common/util/concurrent/ThreadFactoryBuilder
Environment:
- Flink version: 1.14.3
- Flink CDC version: 2.2.0
- Database and version: MySQL 8.0.*
To Reproduce
Steps to reproduce the behavior:
Class com.ververica.cdc.debezium.DebeziumSourceFunction in com.ververica:flink-connector-debezium:2.2.0 has a dependency (import) on org.apache.flink.shaded.guava18.com.google.common.util.concurrent.ThreadFactoryBuilder, which is not available in Flink 1.14.* (it was shipped with Flink 1.13.*).
Is there a temporary fix for this issue?
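A possible temporary workaround, sketched below, is to put the Guava-18-relocated classes back on the job classpath by bundling the flink-shaded-guava artifact that Flink 1.13 shipped with; the version coordinate 18.0-13.0 is an assumption and should be checked against Maven Central before relying on it:
<!-- Sketch of a temporary workaround: bundle the Guava-18-relocated classes that
     flink-connector-debezium 2.2.0 imports. Because the relocated package names
     contain the Guava version, these classes do not clash with Flink 1.14's guava30
     classes. The version 18.0-13.0 is an assumption; verify it on Maven Central. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-shaded-guava</artifactId>
    <version>18.0-13.0</version>
</dependency>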
+1
@swetakala you can consider building the flink-cdc-connectors 2.2.0 source code against flink-shaded-guava-30-*.jar; for more details see https://blog.csdn.net/weixin_40455124/article/details/122247436
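If you take that route, the change is roughly the following; this is only a sketch, and the exact module and property layout of the flink-cdc-connectors poms may differ, so treat the coordinates as assumptions:
<!-- Sketch only: in the flink-cdc-connectors source tree, point the shaded-guava
     dependency at the Guava-30 relocation that Flink 1.14 ships (assumed coordinate
     30.1.1-jre-14.0; check the flink-shaded release notes). -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-shaded-guava</artifactId>
    <version>30.1.1-jre-14.0</version>
</dependency>
<!-- The Java sources must also be updated from org.apache.flink.shaded.guava18.*
     imports to org.apache.flink.shaded.guava30.* before rebuilding with
     mvn clean install -DskipTests. -->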
Using the DataStream API with the POM dependency, I encountered the same problem with Flink 1.14.3, which uses shaded Guava 30: jobs running on the cluster fail because they cannot find the shaded Guava 18 classes.
Shaded Guava is a dependency of both flink-cdc-connectors 2.2.0 and flink-streaming 1.14.4, but the two use different versions.
When using Guava 18 instead:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flink/shaded/guava30/com/google/common/collect/Lists
	at org.apache.flink.streaming.api.transformations.LegacySinkTransformation.getTransitivePredecessors(LegacySinkTransformation.java:116)
I have the same problem with the MongoDB connector! Will there be a fix?
How do we build the job and package the jar so that it can be deployed to a cluster after making changes to the source in a CI process?
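One common approach (a sketch, not specific to this project's build) is to package a fat job jar with the maven-shade-plugin in CI and submit it with flink run, keeping Flink's own core dependencies as provided so only the connector and its shaded classes end up in the jar:
<!-- Minimal sketch of a fat-jar build for CI: shade the connector (and its
     relocated Guava classes) into the job jar; plugin version is only an example. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- merges META-INF/services entries from all bundled connectors -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>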
+1, mark
Can we contribute and fix the issue? Can someone guide me on how we can start off?
We are facing the same problem on my team with the MongoDB connector. Does anyone know if there will be a fix, or any kind of plan?
Environment:
- Flink version: 1.14.*
- Flink CDC version: 2.2.1
- Database and version: MongoDB 4.4.*
+1 mark
+1
Is this problem solved?
Is there a ticket opened for this issue, or shall I open one?
Any updates on this?
Found this issue, which suggests using the "SQL jar" instead.
My guess is that this means com.ververica:flink-sql-connector-mongodb-cdc (in your case flink-sql-connector-mysql-cdc), which fixes the exception, but I'm completely new to CDC and still have to find out whether this introduces any consequences (I want to use the DataStream API).
I only found a FAQ entry about the SQL jar: it's the fat jar with all dependencies.
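Concretely, that would mean depending on the fat SQL connector artifact instead of the thin one, e.g. for MySQL (a sketch; the same pattern presumably applies to the MongoDB connector):
<!-- Use the fat SQL connector jar, which bundles and shades its transitive
     dependencies, instead of the thin flink-connector-mysql-cdc artifact. -->
<dependency>
    <groupId>com.ververica</groupId>
    <artifactId>flink-sql-connector-mysql-cdc</artifactId>
    <version>2.2.1</version>
</dependency>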
I recently tried Flink 1.14.3 with the CDC 2.1.1 release and did not get any compatibility issues. Maybe later versions have fixed it. Here is the list of dependencies I added:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-jdbc_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-runtime-web_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-common</artifactId>
    <version>1.13.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>com.ververica</groupId>
    <artifactId>flink-connector-mysql-cdc</artifactId>
    <version>2.2.1</version>
</dependency>
<dependency>
    <groupId>com.ververica</groupId>
    <artifactId>flink-sql-connector-mysql-cdc</artifactId>
    <version>2.2.1</version>
</dependency>
That's because you are using the SQL dependency flink-sql-connector-mysql-cdc, which includes all transitive dependencies.
mark org.apache.flink.shaded.guava30.com.google.common.collect.Lists