Nikita Blagodarnyi
You are using an incorrect version. Your connector is built for HBase 2.x and relies on these dependencies, while your HBase is older.
Hello Sercan! Thanks for your reply. I couldn't reproduce your issue with the UI. I used this command to rebuild Ranger from my branch, and everything works fine in...
Hello @FerArribas14! Please refer to the PR description. I've been building/testing it with different Hadoop versions, including 3.3.6. Version built against 3.3.6 now works in our production environment.
`spark.[driver|executor].extraClassPath` should be a colon-separated (on Unix-like systems) list of local jars with absolute local paths. spark-submit silently ignores errors in this config, which is why Spark cannot find the mentioned class on its classpath....
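To illustrate, a minimal sketch (the jar paths below are hypothetical placeholders, not the actual jars from this thread):

```shell
# extraClassPath entries are joined with ':' (the Unix path separator);
# every entry must be an absolute path on the local filesystem.
EXTRA_CP=/opt/jars/comet-spark.jar:/opt/jars/other-dep.jar
echo "$EXTRA_CP"

# The spark-submit call would then look like (not executed here):
#   spark-submit \
#     --conf spark.driver.extraClassPath="$EXTRA_CP" \
#     --conf spark.executor.extraClassPath="$EXTRA_CP" \
#     app.jar
```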
@radhikabajaj123 note that this local jar (with local path) should be present on all worker nodes of your cluster.
@ramyadass please carefully read all the comments above. `extraClassPath` should be a local path, not hdfs.
Does this local path/file `/home/hadoop/.ivy2/jars/comet-spark-spark3.5_2.12-0.4.0.jar` exist on **all** affected workers (cluster machines where Spark drivers/executors can run)?
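A quick way to check is a sketch like the one below, run on every node where a driver or executor may start (e.g. via ssh or a parallel shell; the path is the one from this thread):

```shell
# Sanity check: does the jar exist locally on this node?
JAR=/home/hadoop/.ivy2/jars/comet-spark-spark3.5_2.12-0.4.0.jar
if [ -f "$JAR" ]; then
  STATUS="present"
else
  STATUS="MISSING"
fi
echo "$STATUS: $JAR on $(hostname)"
```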
I faced the same issue when I tried to set HDFS locations for the extra classpaths, e.g. `spark.[driver|executor].extraClassPath=hdfs:///foo/bar/comet.jar`. AFAIU it only supports local files, and spark-submit silently ignores errors (like missing jars...