flink-cdc
Could not find any factory for identifier 'sqlserver-cdc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
Describe the bug: Running a PyFlink Table API job that uses the sqlserver-cdc connector fails with the factory-not-found error above, although the same DDL works in the Flink SQL client.
Environment:
- Flink version: 1.14.4
- Flink CDC version: flink-sql-connector-sqlserver-cdc-2.2.1.jar
- Database and version: Microsoft SQL Server 2017 (RTM-CU20) (KB4541283) - 14.0.3294.2 (X64)
To Reproduce
Steps to reproduce the behavior:
- The test data:
- The test code: I prepared the following three jars and set them in the str_jars variable:
- flink-connector-jdbc_2.12-1.14.4.jar
- flink-sql-connector-sqlserver-cdc-2.2.1.jar
- mssql-jdbc-10.2.0.jre8.jar
from pyflink.table import EnvironmentSettings, TableEnvironment
from pyflink.table.expressions import col

env_settings = EnvironmentSettings.in_batch_mode()
table_env = TableEnvironment.create(env_settings)
table_env.get_config().get_configuration().set_string("pipeline.jars", str_jars)

source_ddl = f'''
CREATE TABLE news_basic (
    NEWS_ID VARCHAR(10),
    NEWS_TYPE CHAR(1),
    SERVICE_DT VARCHAR(12),
    ARTICLE_TITLE VARCHAR(100),
    ARTICLE_SUMMARY VARCHAR(300),
    PRIMARY KEY (NEWS_ID) NOT ENFORCED
) WITH (
    'connector' = 'sqlserver-cdc',
    'hostname' = 'xxxx',
    'port' = '1433',
    'username' = 'xx',
    'password' = 'rxx',
    'database-name' = 'xxx',
    'schema-name' = 'dbo',
    'table-name' = 'xxx'
)
'''

table_env.execute_sql(source_ddl)
src = table_env.from_path("news_basic")
src.select(col("NEWS_ID"), col("ARTICLE_TITLE")).execute().print()
- The error: when I execute the file with Python, it raises the error shown in the image below. However, in Flink sql-client mode it works well. Please help me :) Thank you for your contribution.
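As an aside, Flink's pipeline.jars option expects a semicolon-separated list of URLs (e.g. file:///...). The report does not show how str_jars was built, so here is one plausible sketch; the helper name is mine, not from the original:

```python
import os

def build_pipeline_jars(jar_paths):
    """Join local jar paths into the semicolon-separated list of
    file:// URLs that Flink's pipeline.jars option expects."""
    return ";".join("file://" + os.path.abspath(p) for p in jar_paths)

jars = [
    "flink-connector-jdbc_2.12-1.14.4.jar",
    "flink-sql-connector-sqlserver-cdc-2.2.1.jar",
    "mssql-jdbc-10.2.0.jre8.jar",
]
str_jars = build_pipeline_jars(jars)
```

If any of the three jars is missing from this list (or the separator is wrong), the connector factory will not be on the job's classpath and the factory-not-found error follows.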


I ran into the same problem with the MySQL CDC connector. Did you figure it out?
If you are building with Gradle locally, you may want to try adding
mergeServiceFiles()
at the bottom of build.gradle.
It worked for me when submitting the uber jar to a Flink cluster.
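For reference, mergeServiceFiles() comes from the Shadow plugin and is configured on the shadowJar task; a minimal build.gradle sketch (the plugin version here is illustrative):

```groovy
plugins {
    id 'java'
    // Shadow plugin provides the shadowJar task; version is illustrative
    id 'com.github.johnrengelman.shadow' version '7.1.2'
}

shadowJar {
    // Merge META-INF/services files from all dependencies so Flink's
    // factory discovery (Java SPI) still sees every connector factory.
    mergeServiceFiles()
}
```

Without this, each dependency's META-INF/services/org.apache.flink.table.factories.Factory file overwrites the others in the uber jar, and the losing connectors become undiscoverable.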
Hey @hbk671104, I added mergeServiceFiles()
to my build.gradle file and then ran the following commands:
./gradlew build
./gradlew clean installShadowDist
(as shown here); however, I'm still getting the same error. Any ideas what I might be missing?
Please make sure the connector's factory class is listed in the META-INF/services/org.apache.flink.table.factories.Factory file inside your jar.
For Maven projects, add a transformer to the maven-shade-plugin configuration. Example:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.1.1</version>
  <executions>
    <!-- Run shade goal on package phase -->
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <!-- something else -->
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
Ref: https://maven.apache.org/plugins/maven-shade-plugin/examples/resource-transformers.html