
[Feature][Module Name] Dinky1.2.4 version is creating a hudi-catalog link minio error S3 authentication problem

Open zhanghengdashuaibi opened this issue 3 months ago • 1 comment

Search before asking

  • [x] I had searched in the issues and found no similar feature requirement.

Description

My environment is Dinky 1.2.4 and Flink 1.20.2, with Hudi provided by hudi-flink1.20-bundle-1.0.0.jar. In SQL I execute:

```sql
CREATE CATALOG hudi_catalog WITH (
  'type' = 'hudi',
  'catalog.path' = 's3a://hudi-warehouse/',
  'mode' = 'dfs',
  'default-database' = 'default'
);
```

```sql
CREATE TABLE hudi_catalog.default.user_behavior (
  id BIGINT,
  user_id BIGINT,
  item_id BIGINT,
  behavior STRING,
  ts TIMESTAMP(3),
  dt STRING,
  PRIMARY KEY (id) NOT ENFORCED
) PARTITIONED BY (dt) WITH (
  'connector' = 'hudi',
  'path' = 's3a://hudi-warehouse/user_behavior',
  'table.type' = 'COPY_ON_WRITE',

  -- Configure S3 parameters at the table level
  'fs.s3a.endpoint' = 'http://172.29.0.10:9000',
  'fs.s3a.access.key' = 'admin',
  'fs.s3a.secret.key' = 'password',
  'fs.s3a.path.style.access' = 'true',

  -- Hudi configuration
  'write.precombine.field' = 'ts',
  'write.operation' = 'upsert',
  'hoodie.datasource.write.recordkey.field' = 'id'
);
```

It fails with the following error:

```
Caused by: com.amazonaws.SdkClientException: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY))
    at com.amazonaws.auth.EnvironmentVariableCredentialsProvider.getCredentials(EnvironmentVariableCredentialsProvider.java:49) ~[aws-java-sdk-core-1.12.645.jar:?]
    at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:177) ~[flink-s3-fs-presto-1.20.1.jar:1.20.1]
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1269) ~[aws-java-sdk-core-1.12.645.jar:?]
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:845) ~[aws-java-sdk-core-1.12.645.jar:?]
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:794) ~[aws-java-sdk-core-1.12.645.jar:?]
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:781) ~[aws-java-sdk-core-1.12.645.jar:?]
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:755) ~[aws-java-sdk-core-1.12.645.jar:?]
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:715) ~[aws-java-sdk-core-1.12.645.jar:?]
    at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:697) ~[aws-java-sdk-core-1.12.645.jar:?]
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:561) ~[aws-java-sdk-core-1.12.645.jar:?]
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:541) ~[aws-java-sdk-core-1.12.645.jar:?]
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5520) ~[aws-java-sdk-s3-1.12.645.jar:?]
    at com.amazonaws.services.s3.AmazonS3Client.getBucketRegionViaHeadRequest(AmazonS3Client.java:6501) ~[aws-java-sdk-s3-1.12.645.jar:?]
    at com.amazonaws.services.s3.AmazonS3Client.fetchRegionFromCache(AmazonS3Client.java:6473) ~[aws-java-sdk-s3-1.12.645.jar:?]
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5505) ~[aws-java-sdk-s3-1.12.645.jar:?]
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5467) ~[aws-java-sdk-s3-1.12.645.jar:?]
    at com.amazonaws.services.s3.AmazonS3Client.listObjectsV2(AmazonS3Client.java:1001) ~[aws-java-sdk-s3-1.12.645.jar:?]
    at org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$listObjects$11(S3AFileSystem.java:2595) ~[flink-s3-fs-presto-1.20.1.jar:1.20.1]
    at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.lambda$trackDurationOfOperation$5(IOStatisticsBinding.java:499) ~[flink-s3-fs-presto-1.20.1.jar:1.20.1]
    at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:414) ~[flink-s3-fs-presto-1.20.1.jar:1.20.1]
    at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:377) ~[flink-s3-fs-presto-1.20.1.jar:1.20.1]
    at org.apache.hadoop.fs.s3a.S3AFileSystem.listObjects(S3AFileSystem.java:2586) ~[flink-s3-fs-presto-1.20.1.jar:1.20.1]
    at org.apache.hadoop.fs.s3a.S3AFileSystem$ListingOperationCallbacksImpl.lambda$listObjectsAsync$0(S3AFileSystem.java:2153) ~[flink-s3-fs-presto-1.20.1.jar:1.20.1]
    at org.apache.hadoop.fs.s3a.impl.CallableSupplier.get(CallableSupplier.java:87) ~[flink-s3-fs-presto-1.20.1.jar:1.20.1]
    at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604) ~[?:1.8.0_452]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_452]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_452]
```

I would like to know how I should configure things so that the hudi-catalog works properly with MinIO.
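Editor's note (not part of the original issue, a hedged suggestion): the stack trace shows the request going through `flink-s3-fs-presto-1.20.1.jar`, i.e. Flink's bundled S3 filesystem plugin. That plugin reads credentials from the Flink cluster configuration rather than from table-level `fs.s3a.*` options, which would explain why the SDK falls back to environment variables and fails. A minimal sketch of the cluster-level settings, assuming the endpoint and credentials from the DDL above:

```yaml
# flink-conf.yaml — cluster-level S3/MinIO settings picked up by Flink's
# S3 filesystem plugins (values taken from the issue's table DDL)
s3.endpoint: http://172.29.0.10:9000
s3.access-key: admin
s3.secret-key: password
s3.path.style.access: true
```

These keys are documented for Flink's S3 filesystems; a restart of the JobManager/TaskManagers is required for them to take effect. If the Hadoop S3A path is used instead (e.g. via `flink-s3-fs-hadoop` or a Hadoop classpath), the equivalent `fs.s3a.*` settings would go in `core-site.xml`.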

Use case

No response

Related issues

No response

Are you willing to submit a PR?

  • [x] Yes I am willing to submit a PR!

Code of Conduct

zhanghengdashuaibi avatar Sep 15 '25 02:09 zhanghengdashuaibi

Hello @, this issue has not been active for more than 30 days. This issue will be closed in 7 days if there is no response. If you have any questions, you can comment and reply.


github-actions[bot] avatar Nov 01 '25 00:11 github-actions[bot]