
[DEPRECATED] Script used to manage Hadoop and Spark instances on Google Compute Engine

32 bdutil issues

```
gsutil md gs://cloudgenius
bdutil -p beacloudgenius -b cloudgenius -P cg -z us-central1-f \
  -e /home/user/bdutil/platforms/hdp/ambari_env.sh generate_config my.sh
cat my.sh
import_env /home/user/bdutil/platforms/hdp/ambari_env.sh
PROJECT=beacloudgenius
CONFIGBUCKET=cloudgenius
PREFIX=cg
GCE_ZONE=us-central1-f
GCE_MASTER_MACHINE_TYPE=n1-standard-4
bdutil -e my.sh deploy...
```
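From the snippet above, `generate_config` appears to write the chosen flags into `my.sh` as an `import_env` line plus variable overrides, and that file is then passed back in with `bdutil -e my.sh deploy`.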

Hi - I have built a GCE cluster using `./bdutil deploy --bucket anintelclustergen1-m-disk -n 2 -P anintelcluster -e extensions/spark/spark_on_yarn_env.sh`. In the bucket parameters, both on the command line and in bdutil_env.sh, I have...
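For reference, a minimal sketch of the two places the bucket can be set; the bucket and prefix names below are placeholders, and the variable names mirror the `my.sh` output quoted above:

```
# Option 1: pass the bucket and prefix as flags at deploy time
# (bucket/prefix names are placeholders).
./bdutil deploy -b my-config-bucket -P mycluster -e extensions/spark/spark_on_yarn_env.sh

# Option 2: put the same settings in an env file and deploy from it,
# mirroring the CONFIGBUCKET/PREFIX variables shown above.
cat > my_env.sh <<'EOF'
import_env extensions/spark/spark_on_yarn_env.sh
CONFIGBUCKET=my-config-bucket
PREFIX=mycluster
EOF
./bdutil -e my_env.sh deploy
```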

Hi, I'm trying to deploy HDP 2.3 with bdutil. I've set these configuration values in `ambari.conf`:

```
AMBARI_REPO="http://public-repo-1.hortonworks.com/ambari/centos6/2.x/updates/2.0.1/ambari.repo"
AMBARI_STACK_VERSION='2.3'
```

The deployment fails with:

```
Mon Jun 29 10:26:52 CEST...
```

I'm using a Google trial account. When I execute the command `./bdutil -e platforms/hdp/ambari_env.sh deploy` it throws the following error on the master and worker nodes: `'hadoop-m' not yet sshable (1); sleeping...`

Is there a way to pass metadata to the provisioned instances? In particular, I'd like to pass a "startup-script-url" metadata parameter to the instances that bdutil creates, e.g. `--metadata startup-script-url=gs://bucket/bootstrap.sh`, so...
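None of the snippets above show a bdutil flag for instance metadata, so as a hedged workaround sketch (instance name, zone, and URL are placeholders), metadata can be attached to already-created instances with gcloud; note that a startup-script-url added after first boot only takes effect on the next restart:

```
# Attach metadata to an already-deployed instance
# (instance name, zone, and bucket path are placeholders).
gcloud compute instances add-metadata hadoop-m \
  --zone us-central1-f \
  --metadata startup-script-url=gs://bucket/bootstrap.sh
```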

Most extensions are only supported on Hadoop 1; are there plans to extend support to Hadoop 2?

This isn't meant as a criticism, as I realise there are 1,000 possible things that could be going wrong, but this script seems to only successfully deploy a cluster in...

I got this error when deploying bdutil with Ambari: https://gist.github.com/meodorewan/ab0d48e8e32a64732d5c. It seems that I can't SSH to my instances with `gcloud --project=abivin-clusters --quiet --verbosity=info compute ssh hadoop-t-m --command=sudo su...`
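A hedged way to narrow this down (project and instance name are taken from the report; the zone is a placeholder) is to try the SSH step by hand and confirm that a firewall rule still allows tcp:22 into the network:

```
# Try a minimal SSH command to the master node directly
# (use the zone the cluster was actually deployed in).
gcloud --project=abivin-clusters compute ssh hadoop-t-m \
  --zone us-central1-f --command='echo reachable'

# List firewall rules to confirm SSH (tcp:22) is allowed.
gcloud --project=abivin-clusters compute firewall-rules list
```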

Right now `spark.eventLog.dir` gets set to a GCS path regardless of what DEFAULT_FS is set to for the deployment; this means that if a deployment intentionally disables GCS accessibility, e.g. by removing external...
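For context, a hedged workaround sketch of pointing the event log at HDFS instead of GCS; it assumes `$SPARK_HOME` points at the Spark install on the cluster nodes, and the HDFS path is a placeholder rather than anything bdutil writes:

```
# Illustrative only: make the Spark event log land on HDFS instead of GCS.
cat >> "$SPARK_HOME/conf/spark-defaults.conf" <<'EOF'
spark.eventLog.enabled  true
spark.eventLog.dir      hdfs:///spark-eventlog
EOF
```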

The -f/--force flag is supposed to skip all confirmation prompts, but it doesn't correctly skip the prompts for running as root or for overwriting an existing config in generate_config.
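For illustration only (this is not bdutil's actual code), the expected semantics are roughly a confirmation helper that is bypassed whenever the force flag is set:

```
# Illustrative sketch of the expected --force behaviour; not bdutil's code.
FORCE=0
[[ "$1" == "-f" || "$1" == "--force" ]] && FORCE=1

confirm() {
  # Skip every interactive prompt when --force was given.
  if (( FORCE )); then
    return 0
  fi
  read -r -p "$1 [y/N] " reply
  [[ "$reply" == "y" || "$reply" == "Y" ]]
}

confirm "Running as root; continue?" || exit 1
confirm "Config file already exists; overwrite?" || exit 1
```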