
This error is reported when a Kerberos-secured HDFS cluster writes to a simple-auth HDFS cluster. Could you please check whether the authentication method has not been switched over?

YangQian-99 opened this issue 2 years ago • 2 comments

Search before asking

  • [ ] I had searched in the issues and found no similar question.

  • [ ] I had googled my question but I didn't get any help.

  • [ ] I had read the documentation: ChunJun doc, but it didn't help me.

Description

This error is reported when a Kerberos-secured HDFS cluster writes to a simple-auth HDFS cluster. Could you please check whether the authentication method has not been switched over? The full error log is as follows:

```
2023-04-22 22:59:17,294 INFO org.apache.hadoop.io.compress.CodecPool [] - Got brand-new compressor [.snappy]
2023-04-22 22:59:37,528 INFO com.dtstack.chunjun.connector.hdfs.source.HdfsParquetInputFormat [] - subtask input close finished
2023-04-22 22:59:37,536 INFO com.dtstack.chunjun.connector.hdfs.sink.HdfsParquetOutputFormat [] - taskNumber[0] close()
2023-04-22 22:59:37,536 INFO com.dtstack.chunjun.connector.hdfs.sink.HdfsParquetOutputFormat [] - close:Current block writer record:0
2023-04-22 22:59:37,784 INFO org.apache.hadoop.io.retry.RetryInvocationHandler [] - Exception while invoking addBlock of class ClientNamenodeProtocolTranslatorPB over /:8020. Trying to fail over immediately.
java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "QASHV144940.hostname.com/"; destination host is: "QASHV144935.hostname.com":8020;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:776) ~[flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    at org.apache.hadoop.ipc.Client.call(Client.java:1480) ~[flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    at org.apache.hadoop.ipc.Client.call(Client.java:1413) ~[flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) ~[flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    at com.sun.proxy.$Proxy27.addBlock(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:418) ~[flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) ~[flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) [flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    at com.sun.proxy.$Proxy28.addBlock(Unknown Source) [?:?]
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1588) [flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1373) [flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554) [flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
Caused by: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:755) ~[flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:376) ~[flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1529) ~[flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    at org.apache.hadoop.ipc.Client.call(Client.java:1452) ~[flink-shaded-hadoop-2-uber-2.7.5-10.0.jar:2.7.5-10.0]
    ... 14 more
```
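For context, the client-side rule behind the `Caused by` message can be sketched as a tiny decision function. This is an illustration of Hadoop's RPC fallback behavior, not actual Hadoop code: a Kerberos-configured client only downgrades to SIMPLE when `ipc.client.fallback-to-simple-auth-allowed` is set to true; otherwise it raises exactly this kind of error.

```python
# Simplified model of Hadoop RPC auth negotiation (illustration only).
# A secure (Kerberos) client talking to a SIMPLE-auth server either
# falls back (if ipc.client.fallback-to-simple-auth-allowed=true) or fails.

def negotiate_auth(server_auth: str, fallback_allowed: bool) -> str:
    """Return the auth method the client ends up using, or raise IOError."""
    if server_auth == "kerberos":
        return "kerberos"
    if server_auth == "simple":
        if fallback_allowed:
            return "simple"  # downgrade explicitly permitted by configuration
        raise IOError(
            "Server asks us to fall back to SIMPLE auth, but this client "
            "is configured to only allow secure connections."
        )
    raise ValueError(f"unknown auth method: {server_auth}")

# Default (flag unset/false): connecting to a simple-auth cluster fails.
try:
    negotiate_auth("simple", fallback_allowed=False)
except IOError as e:
    print(e)

# With ipc.client.fallback-to-simple-auth-allowed=true the write can proceed.
print(negotiate_auth("simple", fallback_allowed=True))
```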

Code of Conduct

YangQian-99 · Apr 23 '23 05:04

@YangQian-99 When a Kerberos cluster accesses a non-Kerberos cluster, add the following property to core-site.xml. The error log for that situation looks like: `Caused by: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.`

```xml
<property>
  <name>ipc.client.fallback-to-simple-auth-allowed</name>
  <value>true</value>
</property>
```

zoudaokoulife · Apr 23 '23 08:04

@zoudaokoulife Is it mandatory to modify the cluster's core-site.xml file? Would it also work to add the configuration in the ChunJun job, e.g. `'properties.ipc.client.fallback-to-simple-auth-allowed'='true'`?
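As an alternative to editing the cluster-wide core-site.xml, ChunJun's HDFS connector JSON jobs generally accept extra Hadoop client properties via a `hadoopConfig` map in the writer parameters. A minimal sketch follows; the `hadoopConfig` key and its support for this property are assumptions to verify against your ChunJun version, and the `path` value is a hypothetical example:

```json
{
  "name": "hdfswriter",
  "parameter": {
    "path": "hdfs://QASHV144935.hostname.com:8020/tmp/out",
    "hadoopConfig": {
      "ipc.client.fallback-to-simple-auth-allowed": "true"
    }
  }
}
```

Since the property only needs to reach the client-side `Configuration` object, passing it per-job (when supported) avoids changing the cluster defaults.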

YangQian-99 · Apr 23 '23 08:04