
cause Could not find any factory for identifier 'sqlserver-cdc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.

Open algorizm opened this issue 2 years ago • 2 comments

Describe the bug: Creating a 'sqlserver-cdc' table from PyFlink fails with "Could not find any factory for identifier 'sqlserver-cdc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath", even though the connector jar is included via pipeline.jars.

Environment:

  • Flink version : 1.14.4
  • Flink CDC version: flink-sql-connector-sqlserver-cdc-2.2.1.jar
  • Database and version: Microsoft SQL Server 2017 (RTM-CU20) (KB4541283) - 14.0.3294.2 (X64)

To Reproduce:

  1. The test data:
  2. The test code: I prepared the following three jars and set them in the str_jars variable:
      • flink-connector-jdbc_2.12-1.14.4.jar
      • flink-sql-connector-sqlserver-cdc-2.2.1.jar
      • mssql-jdbc-10.2.0.jre8.jar

from pyflink.table import EnvironmentSettings, TableEnvironment
from pyflink.table.expressions import col

env_settings = EnvironmentSettings.in_batch_mode()
table_env = TableEnvironment.create(env_settings)
# str_jars: ';'-separated list of the jar URLs listed above (pipeline.jars format)
table_env.get_config().get_configuration().set_string("pipeline.jars", str_jars)

source_ddl = f'''
    CREATE TABLE news_basic (
        NEWS_ID VARCHAR(10),
        NEWS_TYPE CHAR(1),
        SERVICE_DT VARCHAR(12),
        ARTICLE_TITLE VARCHAR(100),
        ARTICLE_SUMMARY VARCHAR(300),
        PRIMARY KEY (NEWS_ID) NOT ENFORCED
    ) WITH (
        'connector' = 'sqlserver-cdc',
        'hostname' = 'xxxx',
        'port' = '1433',
        'username' = 'xx',
        'password' = 'rxx',
        'database-name' = 'xxx',
        'schema-name' = 'dbo',
        'table-name' = 'xxx'
    )
'''

table_env.execute_sql(source_ddl)
src = table_env.from_path("news_basic")
src.select(col("NEWS_ID"), col("ARTICLE_TITLE")).execute().print()

  3. The error: when I run this file with Python it fails with the error shown in the screenshots below, but the same DDL works fine in the Flink SQL client. Please help me :) Thank you for your contribution.

[Screenshots of the error, taken 2022-05-19 at 6:49 PM and 6:51 PM]


algorizm avatar May 19 '22 10:05 algorizm

I ran into the same problem with the MySQL CDC connector. Did you figure it out?

vleyong avatar Jul 01 '22 03:07 vleyong

If you are building with gradle locally, you may wanna try adding

mergeServiceFiles()

at the bottom of build.gradle

It worked for me when submitting the uber jar to a flink cluster.
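
For context, a minimal build.gradle sketch of where that call usually lives (the Shadow plugin id and version below are assumptions, adjust to your own build):

plugins {
    id 'java'
    // The Shadow plugin provides the shadowJar task and mergeServiceFiles()
    id 'com.github.johnrengelman.shadow' version '7.1.2'
}

shadowJar {
    // Merge META-INF/services entries from all dependencies so Flink's
    // ServiceLoader can still discover every DynamicTableFactory
    // (including the sqlserver-cdc one) inside the uber jar.
    mergeServiceFiles()
}

Without the merge, the service files from flink-table and the CDC connector overwrite each other in the uber jar, which is typically what produces this "Could not find any factory" error.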

hbk671104 avatar Jul 21 '22 08:07 hbk671104

Hey @hbk671104, I added mergeServiceFiles() to my build.gradle file and then ran the following commands:

./gradlew build
./gradlew clean installShadowDist

(as shown here). However, I'm still getting the same error. Any ideas what I might be missing?

jimmy3142 avatar Mar 23 '23 00:03 jimmy3142

Please make sure the connector's factory is listed in the META-INF/services/org.apache.flink.table.factories.Factory file inside your jar.
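
If you are building the uber jar with Gradle, a hypothetical helper task along these lines can print that service file (the task name and jar lookup are assumptions, not part of Flink or the connector; it assumes the Shadow plugin is applied):

// Diagnostic sketch for build.gradle: prints the merged Factory service file
// from the shaded jar so you can confirm the connector's factory is listed.
tasks.register('printTableFactories') {
    dependsOn tasks.shadowJar
    doLast {
        def jar = tasks.shadowJar.archiveFile.get().asFile
        def path = 'META-INF/services/org.apache.flink.table.factories.Factory'
        new java.util.zip.ZipFile(jar).withCloseable { zip ->
            def entry = zip.getEntry(path)
            println(entry ? zip.getInputStream(entry).text : "MISSING: ${path}")
        }
    }
}

Running ./gradlew printTableFactories should then list the connector's factory class (a name ending in SqlServerTableFactory); if it is missing, the service files were not merged into the jar.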

ruanhang1993 avatar Jun 30 '23 07:06 ruanhang1993

For Maven projects, the equivalent of the Gradle mergeServiceFiles() call above is to add a ServicesResourceTransformer to the maven-shade-plugin configuration. Example:

			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-shade-plugin</artifactId>
				<version>3.1.1</version>
				<executions>
					<!-- Run shade goal on package phase -->
					<execution>
						<phase>package</phase>
						<goals>
							<goal>shade</goal>
						</goals>
						<configuration>
							<!-- something else -->
							<transformers>
								<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
							</transformers>
						</configuration>
					</execution>
				</executions>
			</plugin>

Ref: https://maven.apache.org/plugins/maven-shade-plugin/examples/resource-transformers.html

craynic avatar Apr 19 '24 03:04 craynic