
[Bug] [dinky-admin] Flink on Iceberg task check fails because ADD CUSTOMJAR does not take effect

Open · gitfortian opened this issue 9 months ago · 1 comment

Search before asking

  • [X] I had searched in the issues and found no similar issues.

What happened


Exception in executing FlinkSQL:
    CREATE CATALOG iceberg_catalog WITH (
      'type'='iceberg',
      'catalog-type'='hadoop',
      'warehouse'='hdfs:///user/security_analysis/catalog/iceberg'
    )

java.lang.IllegalArgumentException: Cannot initialize Catalog implementation org.apache.iceberg.hadoop.HadoopCatalog: Cannot find constructor for interface org.apache.iceberg.catalog.Catalog
    Missing org.apache.iceberg.hadoop.HadoopCatalog [java.lang.ClassNotFoundException: org.apache.iceberg.hadoop.HadoopCatalog]
    at org.apache.iceberg.CatalogUtil.loadCatalog(CatalogUtil.java:240)
    at org.apache.iceberg.flink.CatalogLoader$HadoopCatalogLoader.loadCatalog(CatalogLoader.java:86)
    at org.apache.iceberg.flink.FlinkCatalog.<init>(FlinkCatalog.java:114)
    at org.apache.iceberg.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:166)
    at org.apache.iceberg.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:139)
    at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:413)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1426)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1172)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:730)
    at org.dinky.executor.DefaultTableEnvironment.executeSql(DefaultTableEnvironment.java:257)
    at org.dinky.executor.Executor.executeSql(Executor.java:208)
    at org.dinky.explainer.Explainer.explainSql(Explainer.java:213)
    at org.dinky.job.JobManager.explainSql(JobManager.java:410)
    at org.dinky.service.task.FlinkSqlTask.explain(FlinkSqlTask.java:57)
    at org.dinky.service.impl.TaskServiceImpl.explainTask(TaskServiceImpl.java:478)
    at org.dinky.service.impl.TaskServiceImpl$$FastClassBySpringCGLIB$$22087f7c.invoke(<generated>)
    at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218)
    at org.springframework.aop

What you expected to happen

The check should pass.

How to reproduce

Create a FlinkSQL task as follows:

ADD CUSTOMJAR 'rs:/connectors/iceberg-flink-runtime-1.16-1.5.0.jar';
CREATE CATALOG iceberg_catalog WITH (
  'type'='iceberg',
  'catalog-type'='hadoop',
  'warehouse'='hdfs:///user/security_analysis/catalog/iceberg'
);

USE CATALOG iceberg_catalog;
-- create a word count table
CREATE TABLE IF NOT EXISTS `iceberg_catalog`.`default`.word_count (
    word STRING
);

CREATE TABLE word_table (
    word STRING
) WITH (
    'connector' = 'datagen'
);
-- paimon requires checkpoint interval in streaming mode

-- write streaming data to dynamic table
INSERT INTO `iceberg_catalog`.`default`.word_count SELECT word FROM word_table;

Then click the Check button.

Anything else

No response

Version

1.0.1

Are you willing to submit PR?

  • [ ] Yes I am willing to submit a PR!

Code of Conduct

gitfortian avatar Apr 29 '24 09:04 gitfortian

The root cause still needs to be investigated. For now, please work around it by adding the dependency jar to extends/flink1.16 and to the corresponding Flink directory instead.

Zzm0809 avatar Apr 29 '24 09:04 Zzm0809
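
A minimal sketch of that workaround, assuming a typical Dinky + Flink 1.16 install layout; the paths and the restart step below are assumptions for illustration, not confirmed in this issue. The idea is to copy the iceberg-flink-runtime jar into Dinky's extends/flink1.16 directory and into the Flink distribution's lib directory, then restart Dinky and the Flink cluster so the classes are on the classpath when the check runs.

    # Illustrative only: DINKY and FLINK paths are assumptions about a typical
    # layout, not taken from this issue. Adjust them for your installation.
    import shutil
    from pathlib import Path

    JAR = Path("/opt/jars/iceberg-flink-runtime-1.16-1.5.0.jar")   # local copy of the runtime jar
    DINKY_EXTENDS = Path("/opt/dinky/extends/flink1.16")           # Dinky's per-Flink-version extends dir
    FLINK_LIB = Path("/opt/flink-1.16.2/lib")                      # Flink distribution's lib dir

    for target in (DINKY_EXTENDS, FLINK_LIB):
        target.mkdir(parents=True, exist_ok=True)      # create the dir if it is missing
        shutil.copy2(JAR, target / JAR.name)           # copy the jar, preserving metadata
        print(f"copied {JAR.name} -> {target}")

With the jar present in both locations and the services restarted, the CREATE CATALOG statement above should be able to load org.apache.iceberg.hadoop.HadoopCatalog during the check without relying on ADD CUSTOMJAR.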