sohurdc
The BeeLine client can get an error, but the existing job monitoring is all based on the YARN API. Is there any plan to support correcting the job status in the future?...
Here are my Hive policy and HDFS policy. I have added both the Hive and HDFS policies, but they have no effect.
Thank you very much, you’re right: the Ranger Hive policy also needs to include the URL path, but Hive itself doesn’t require this. Is this a special permission configuration...
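
For reference, one way to add such a URL policy is through Ranger's public REST API. This is only a minimal sketch under assumptions: the Ranger admin host, the service name ("hive"), the policy name, the credentials, and the path pattern are hypothetical placeholders, not values from this thread.

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import java.nio.charset.StandardCharsets
import java.util.Base64

object CreateUrlPolicy {
  def main(args: Array[String]): Unit = {
    // Hypothetical Ranger admin endpoint; POST /service/public/v2/api/policy creates a policy.
    val rangerUrl = "http://ranger-admin:6080/service/public/v2/api/policy"

    // Hypothetical Hive-service policy granting user bdwh [read] on a URL resource.
    val policyJson =
      """{
        |  "service": "hive",
        |  "name": "bdwh_panther_url_read",
        |  "resources": {
        |    "url": { "values": ["/user/bdwh/panther/*"], "isRecursive": true }
        |  },
        |  "policyItems": [
        |    { "users": ["bdwh"],
        |      "accesses": [ { "type": "read", "isAllowed": true } ] }
        |  ]
        |}""".stripMargin

    // Basic auth with placeholder credentials.
    val auth = Base64.getEncoder.encodeToString(
      "admin:admin".getBytes(StandardCharsets.UTF_8))

    val request = HttpRequest.newBuilder(URI.create(rangerUrl))
      .header("Content-Type", "application/json")
      .header("Authorization", s"Basic $auth")
      .POST(HttpRequest.BodyPublishers.ofString(policyJson))
      .build()

    val response = HttpClient.newHttpClient()
      .send(request, HttpResponse.BodyHandlers.ofString())
    println(s"${response.statusCode}: ${response.body}")
  }
}

The same policy can of course be created in the Ranger admin UI; the point is that the URL resource must cover the path being read directly, not just the database/table resources.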
But I get a new error. In Kyuubi Scala mode:

set kyuubi.operation.language=SCALA;
val in = spark.read.parquet("/user/bdwh/panther/dwd/dwd_panther_mr_eventlog/dt=20250331/hr=16");
in.show;

Then I get the error below: org.apache.kyuubi.plugin.spark.authz.AccessControlException: Permission denied: user [bdwh] does not have [read] privilege on...
The HDFS policy has no effect. In Kyuubi Scala mode:

set kyuubi.operation.language=SCALA;
val in = spark.read.parquet("/user/bdwh/panther/dwd/dwd_panther_mr_eventlog/dt=20250331/hr=16");
in.show;

It’s strange: in.count works and returns a value, but in.show throws an error saying...
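
To make the observation concrete, here is a minimal reproduction sketch in Kyuubi Scala mode (spark is the session Kyuubi provides; the path is the one from this thread). The comment on count states a hypothesis only, not a confirmed cause: a count can be planned without projecting any columns, so it may not trigger the same path-read check that show does.

val in = spark.read.parquet("/user/bdwh/panther/dwd/dwd_panther_mr_eventlog/dt=20250331/hr=16")

in.count()  // observed to succeed; hypothesis: no column data needs to be read
in.show()   // observed to fail: AccessControlException, user [bdwh] lacks [read] on the path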