
[Bug] [dlink-executor-0.6.6.jar] Dlink cannot connect to Hive; seems like a Kerberos issue

Open wuuu0 opened this issue 3 years ago • 3 comments

Search before asking

  • [X] I had searched in the issues and found no similar issues.

What happened

The statements below run successfully through sql-client outside of Dlink, but a job created in Dlink fails with the error shown below. I suspect a Kerberos authentication problem (our company authenticates via kinit, and the Kerberos-related parameters in flink-conf are left at their defaults), but I don't quite understand why plain sql-client works while Dlink does not.
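For context, a kinit-based login like the one described above is normally established in an interactive shell before launching sql-client; the commands below are a minimal sketch with a placeholder keytab path and principal:

# Obtain a Kerberos TGT from a keytab (placeholder path and principal)
kinit -kt /path/to/user.keytab user@EXAMPLE.COM
# Confirm a valid ticket now sits in the ticket cache that sql-client picks up
klist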

CREATE CATALOG myhive
  WITH (
    'type' = 'hive',
    'default-database' = 'default',
    'hive-conf-dir' = '/etc/hive/conf'
  );

USE CATALOG myhive;

SHOW tables

Executed via sql-client:

(screenshot: sql-client runs the statements successfully)

Executed as a job submitted through Dlink:

[dlink] 2022-07-27 18:09:54 CST INFO  org.apache.flink.table.catalog.hive.HiveCatalog 259 createHiveConf - Setting hive conf dir as /etc/hive/conf
[dlink] 2022-07-27 18:09:54 CST WARN  org.apache.hadoop.util.NativeCodeLoader 60 <clinit> - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[dlink] 2022-07-27 18:09:54 CST INFO  org.apache.flink.table.catalog.hive.HiveCatalog 221 <init> - Created HiveCatalog 'myhive'
[dlink] 2022-07-27 18:09:54 CST INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient 308 getIfClientFilterEnabled - HMS client filtering is enabled.
[dlink] 2022-07-27 18:09:54 CST INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient 472 open - Trying to connect to metastore with URI thrift://datatest1:9083
[dlink] 2022-07-27 18:09:54 CST ERROR org.apache.thrift.transport.TSaslTransport 313 open - SASL negotiation failure javax.security.sasl.SaslException: GSS initiate failed
...
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:701) [flink-table-api-java-uber-1.15.0.jar:1.15.0]
        at com.dlink.executor.Executor.executeSql(Executor.java:250) [dlink-executor-0.6.6.jar!/:?]
        at com.dlink.job.JobManager.executeSql(JobManager.java:271) [dlink-core-0.6.6.jar!/:?]
        at com.dlink.service.impl.StudioServiceImpl.executeFlinkSql(StudioServiceImpl.java:141) [classes!/:?]
        at com.dlink.service.impl.StudioServiceImpl.executeSql(StudioServiceImpl.java:130) [classes!/:?]
        at com.dlink.controller.StudioController.executeSql(StudioController.java:68) [classes!/:?]
        at com.dlink.controller.StudioController$$FastClassBySpringCGLIB$$e6483d87.invoke(<generated>) [classes!/:?]
        at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218) [spring-core-5.3.15.jar!/:5.3.15]
        at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:783) [spring-aop-5.3.15.jar!/:5.3.15]
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) [spring-aop-5.3.15.jar!/:5.3.15]
        at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:753) [spring-aop-5.3.15.jar!/:5.3.15]
        at org.springframework.aop.framework.adapter.AfterReturningAdviceInterceptor.invoke(AfterReturningAdviceInterceptor.java:57) [spring-aop-5.3.15.jar!/:5.3.15]
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) [spring-aop-5.3.15.jar!/:5.3.15]
        at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:753) [spring-aop-5.3.15.jar!/:5.3.15]
        at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:97) [spring-aop-5.3.15.jar!/:5.3.15]
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) [spring-aop-5.3.15.jar!/:5.3.15]
        at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:753) [spring-aop-5.3.15.jar!/:5.3.15]
        at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:698) [spring-aop-5.3.15.jar!/:5.3.15]
        at com.dlink.controller.StudioController$$EnhancerBySpringCGLIB$$3c120105.executeSql(<generated>) [classes!/:?]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_202]
        ....
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[?:1.8.0_202]
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) ~[?:1.8.0_202]
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[?:1.8.0_202]
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) ~[?:1.8.0_202]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.8.0_202]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_202]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_202]
        ....

What you expected to happen

To be able to connect to Hive through Dlink.

How to reproduce

My environment: Hadoop 3.0.0-cdh6.3.4, Flink 1.15.0, Dlink 0.6.6

My plugins: (screenshot)

Anything else

No response

Version

0.6.6

Are you willing to submit PR?

  • [ ] Yes I am willing to submit a PR!

Code of Conduct

wuuu0 avatar Jul 27 '22 10:07 wuuu0

May I ask which submission mode you are using: yarn-session, per-job, or application?

aiwenmo avatar Jul 27 '22 15:07 aiwenmo

May I ask which submission mode you are using: yarn-session, per-job, or application?

yarn-session

wuuu0 avatar Jul 28 '22 00:07 wuuu0

With yarn-session, you currently still need to modify the code yourself to get Kerberos authentication working in dinky. There is one approach that works today: add the Kerberos configuration to flink-conf and submit the job in per-job or application mode, and it will run normally. (See the configuration sketch after this comment.)

aiwenmo avatar Jul 28 '22 01:07 aiwenmo
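For reference, a minimal sketch of the Kerberos entries mentioned above as they could look in flink-conf.yaml; the keytab path and principal are placeholders:

# Log in from a keytab rather than relying on a kinit ticket cache on the server
security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: /path/to/dinky.keytab
security.kerberos.login.principal: dinky@EXAMPLE.COM

In per-job or application mode, Flink ships the configured keytab to the YARN containers and logs in from it there, so the submitted job does not depend on a ticket cache on the Dlink host.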

With yarn-session, you currently still need to modify the code yourself to get Kerberos authentication working in dinky. There is one approach that works today: add the Kerberos configuration to flink-conf and submit the job in per-job or application mode, and it will run normally.

Is there a concrete example of this? Thanks!!! Also, is Kerberos-authenticated Hive supported in standalone mode?

JimmyWang0 avatar Jul 26 '23 10:07 JimmyWang0