
Compatibility problem with Flink 1.14.*

Open anavrotski opened this issue 3 years ago • 14 comments

java.lang.NoClassDefFoundError: org/apache/flink/shaded/guava18/com/google/common/util/concurrent/ThreadFactoryBuilder

Environment :

  • Flink version : 1.14.3
  • Flink CDC version: 2.2.0
  • Database and version: MySQL 8.0.*

To Reproduce / Steps to reproduce the behavior: The class com.ververica.cdc.debezium.DebeziumSourceFunction in com.ververica:flink-connector-debezium:2.2.0 imports org.apache.flink.shaded.guava18.com.google.common.util.concurrent.ThreadFactoryBuilder, which is not available in Flink 1.14.* (it was shipped with Flink 1.13.*).

anavrotski avatar Mar 27 '22 14:03 anavrotski
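
The root cause is that Flink relocates its bundled guava under a version-specific package: Flink 1.13 ships org.apache.flink.shaded.guava18.*, while Flink 1.14 ships org.apache.flink.shaded.guava30.*, so a connector compiled against one cannot find the other. One possible temporary workaround (untested here; 18.0-13.0 is the flink-shaded-guava release used by the Flink 1.13 line, so treat the exact coordinate as an assumption) is to bundle the old relocated classes into the job jar:

```xml
<!-- Hypothetical workaround: provide the guava18-relocated classes that
     flink-connector-debezium 2.2.0 was compiled against.
     18.0-13.0 is the flink-shaded-guava release from the Flink 1.13 line. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-shaded-guava</artifactId>
    <version>18.0-13.0</version>
</dependency>
```

Note this only papers over the missing classes in your fat jar; it does not change what the CDC connector itself was compiled against.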

Is there a temporary fix for this issue?

swetakala avatar Mar 28 '22 16:03 swetakala

+1

springMoon avatar Mar 29 '22 05:03 springMoon

@swetakala you can consider building the flink-cdc-connectors 2.2.0 source code against flink-shaded-guava-30-*.jar; for more details see https://blog.csdn.net/weixin_40455124/article/details/122247436

jackiehff avatar Mar 30 '22 12:03 jackiehff
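
If you go the rebuild-from-source route suggested above, the change is roughly two-fold: bump the flink-shaded-guava dependency in the connector's pom, and rename the imports from guava18 to guava30. A sketch of the dependency side (30.1.1-jre-14.0 is the flink-shaded-guava release used by the Flink 1.14 line; verify the exact coordinate for your Flink patch version):

```xml
<!-- In the flink-cdc-connectors pom: switch to the guava build
     that Flink 1.14 relocates to org.apache.flink.shaded.guava30. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-shaded-guava</artifactId>
    <version>30.1.1-jre-14.0</version>
</dependency>
```

The imports in DebeziumSourceFunction (and any other class referencing the relocated guava) must then be changed from org.apache.flink.shaded.guava18.* to org.apache.flink.shaded.guava30.* before rebuilding.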

Using the DataStream API with the pom dependency, I encountered the same problem on Flink 1.14.3, which uses shaded guava 30. A job run on the cluster fails since it cannot find shaded guava 18.

rinkako avatar Mar 31 '22 09:03 rinkako

Shaded guava is a dependency of both flink-cdc-connectors 2.2.0 and flink-streaming 1.14.4, but the two use different versions.

When using guava 18:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flink/shaded/guava30/com/google/common/collect/Lists
    at org.apache.flink.streaming.api.transformations.LegacySinkTransformation.getTransitivePredecessors(LegacySinkTransformation.java:116)

AwesomeArthurLi avatar Apr 25 '22 01:04 AwesomeArthurLi

I have this same problem with the mongo connector! Will there be a fix?

pedroyzkrak avatar May 07 '22 12:05 pedroyzkrak

How do we build the job and package the jar so that it can be deployed to a cluster after making changes to the source in a CI process?

swetakala avatar May 24 '22 03:05 swetakala

+1, mark

pmtmm avatar May 31 '22 05:05 pmtmm

Can we contribute and fix the issue? Can someone guide me on how to get started?

swetakala avatar May 31 '22 13:05 swetakala

My team is facing the same problem with the MongoDB connector. Does anyone know whether there will be a fix, or any kind of plan?

Environment:

  • Flink version: 1.14.*
  • Flink CDC version: 2.2.1
  • Database and version: MongoDB 4.4.*

jorgeOliveiraTD avatar Jun 02 '22 10:06 jorgeOliveiraTD

+1 mark

nyingping avatar Jun 13 '22 07:06 nyingping

+1

cyongk avatar Jul 07 '22 12:07 cyongk

Is this problem solved?

jacentsao avatar Jul 25 '22 01:07 jacentsao

Is there a ticket open for this issue? Or shall I open one?

swetakala avatar Aug 24 '22 13:08 swetakala

Any updates on this?

Found this issue, which suggests using the "SQL jar" instead.

My guess is that this means com.ververica:flink-sql-connector-mongodb-cdc (in your case flink-sql-connector-mysql-cdc), which fixes the exception, but I'm completely new to CDC and still have to find out whether this has any consequences. (I want to use the DataStream API.)

Only found a FAQ entry about the SQL jar: It's the fat jar with all dependencies.

hb0 avatar Oct 07 '22 11:10 hb0
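
If the SQL-jar route works for you, the swap in the pom would look like the sketch below. This simply replaces the thin connector artifact with the fat jar, which bundles its own shaded copies of the transitive dependencies (including guava), so it sidesteps the clash; the classes it contains can also be used from the DataStream API.

```xml
<!-- Replace the thin connector (flink-connector-mysql-cdc) with the
     fat SQL jar, which shades its transitive dependencies. -->
<dependency>
    <groupId>com.ververica</groupId>
    <artifactId>flink-sql-connector-mysql-cdc</artifactId>
    <version>2.2.1</version>
</dependency>
```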

I recently tried Flink 1.14.3 with the CDC 2.1.1 release and did not get any compatibility issues. Maybe later versions have fixed it. Here is the list of dependencies I added:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-jdbc_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-runtime-web_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-common</artifactId>
    <version>1.13.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>com.ververica</groupId>
    <artifactId>flink-connector-mysql-cdc</artifactId>
    <version>2.2.1</version>
</dependency>
<dependency>
    <groupId>com.ververica</groupId>
    <artifactId>flink-sql-connector-mysql-cdc</artifactId>
    <version>2.2.1</version>
</dependency>
swetakala avatar Oct 07 '22 13:10 swetakala

That's because you are using the SQL dependency flink-sql-connector-mysql-cdc, which includes all transitive dependencies.

hb0 avatar Oct 10 '22 08:10 hb0

mark org.apache.flink.shaded.guava30.com.google.common.collect.Lists

GdHuni avatar Nov 01 '22 03:11 GdHuni