shc
org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
Params
spark-submit \
--conf "spark.ui.port=4042" \
--conf "spark.sql.codegen=true" \
--conf "hbase.zookeeper.quorum=dmp1,dmp2,dmp3" \
--conf "zookeeper.znode.parent=/hbase-unsecure" \
--class test.Example \
--master local[2] \
--name test \
--files /etc/hadoop/conf/core-site.xml,/etc/hbase/conf/hbase-site.xml \
--jars /usr/hdp/current/hbase-client/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/current/hbase-client/lib/hbase-client.jar,/usr/hdp/current/hbase-client/lib/hbase-common.jar,/usr/hdp/current/hbase-client/lib/hbase-server.jar,/usr/hdp/current/hbase-client/lib/guava-12.0.1.jar,/usr/hdp/current/hbase-client/lib/hbase-protocol.jar,/root/lonly/shc-core-1.1.1-2.1-s_2.11.jar \
/home/test/testshc_2.11-0.1.jar
Error
Maybe you want to try copying hbase-site.xml into the Spark conf directory.
Is there any way to set HBase properties programmatically instead of copying hbase-site.xml over? What if I want to connect to multiple HBase clusters?
@thewilliamzhang Yes. You can put the HBase properties into a config file, or express them as JSON. Then, as the example shows here, you can pass the config file (HBaseRelation.HBASE_CONFIGFILE) or the JSON string (HBaseRelation.HBASE_CONFIGURATION) into the option() function.
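The option-based approach described above can be sketched as follows. This is a sketch against the shc 1.1.x API; the catalog, table name, and ZooKeeper quorum are placeholders taken from the command in this thread, not values confirmed to work:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.datasources.hbase.{HBaseRelation, HBaseTableCatalog}

object HBaseConfigExample {
  // Placeholder shc catalog mapping a DataFrame schema to an HBase table.
  val catalog: String =
    s"""{
       |  "table": {"namespace": "default", "name": "test_table"},
       |  "rowkey": "key",
       |  "columns": {
       |    "col0": {"cf": "rowkey", "col": "key",  "type": "string"},
       |    "col1": {"cf": "cf1",    "col": "col1", "type": "string"}
       |  }
       |}""".stripMargin

  // HBase connection properties as a JSON string, instead of relying on an
  // hbase-site.xml on the classpath. Supplying a different quorum per read
  // is one way a single job could talk to multiple HBase clusters.
  val hbaseConfJson: String =
    """{"hbase.zookeeper.quorum": "dmp1,dmp2,dmp3",
      | "zookeeper.znode.parent": "/hbase-unsecure"}""".stripMargin

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("shc-config-example").getOrCreate()

    val df = spark.read
      .options(Map(
        HBaseTableCatalog.tableCatalog -> catalog,
        // Alternatively, point at a file instead of inline JSON:
        // HBaseRelation.HBASE_CONFIGFILE -> "/path/to/hbase-site.xml"
        HBaseRelation.HBASE_CONFIGURATION -> hbaseConfJson))
      .format("org.apache.spark.sql.execution.datasources.hbase")
      .load()

    df.show()
  }
}
```

To target a second cluster, build a second DataFrame with the same catalog but a different hbaseConfJson (different quorum and znode parent).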
@weiqingy I copied hbase-site.xml into the Spark conf directory, but it doesn't work.
My environment: Spark, Hadoop, and HBase all have Kerberos enabled. How do I do this?
Could you give me your email?