databricks-cli
(Legacy) Command Line Interface for Databricks
It looks like the proper way to specify cluster-scoped init_scripts is at cluster creation time: https://docs.databricks.com/clusters/init-scripts.html Support appears to have been added in PR #196, but only to `edit_cluster`. In...
The methods available for Clusters in the REST API listed here: https://docs.azuredatabricks.net/api/latest/clusters.html include the "events" one (https://docs.azuredatabricks.net/api/latest/clusters.html#events), but databricks-cli does not have this command implemented. It would be nice to...
I have a Python script that pulls a ton of data into S3, and I created a notebook in Databricks to turn it into a Parquet file. I was wondering...
# Actual output

Deleting a non-existent file outputs "Delete finished successfully"

# Expected output

Error: b'{"error_code":"RESOURCE_DOES_NOT_EXIST","message":"No file or directory exists on path dbfs:/."}'
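A minimal sketch of how a client could surface that error instead of reporting success, assuming the REST API returns the JSON body shown above on failure (the helper name and error handling here are hypothetical, not the CLI's actual code):

```python
import json


def check_delete_response(body: bytes) -> None:
    """Raise if a DBFS delete response carries an error_code.

    Hypothetical helper: assumes the API returns a JSON body like
    {"error_code": "RESOURCE_DOES_NOT_EXIST", "message": "..."} on
    failure and a plain JSON object with no error_code on success.
    """
    try:
        payload = json.loads(body or b"{}")
    except json.JSONDecodeError:
        payload = {}
    if "error_code" in payload:
        raise RuntimeError(
            f"{payload['error_code']}: {payload.get('message', '')}"
        )
```

With this in place, deleting a nonexistent path would exit with an error rather than printing "Delete finished successfully".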
```
➜ ~ dbfs ls -h
Usage: dbfs ls [OPTIONS] [DBFS_PATH]...

  List files in DBFS.

Options:
  --absolute  Displays absolute paths.
  -l          Displays full information including size and file type.
  --debug     Debug...
```
`dbfs cp -r testSrc testDst/` (where `testSrc` is a directory) seems to behave like `dbfs cp -r testSrc/* testDst/`. To be consistent with `cp` it should probably behave like `dbfs...
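The distinction can be pinned down with pure path arithmetic: under POSIX `cp` semantics, when the destination is an existing directory, `cp -r src dst` creates `dst/src/...` rather than merging `src`'s contents into `dst`. A sketch of the expected destination mapping (the helper is hypothetical, for illustration only):

```python
import posixpath


def dest_path(src_dir: str, rel_file: str, dst: str, dst_is_dir: bool) -> str:
    """Compute where a file under src_dir should land, POSIX-cp style.

    If dst is an existing directory, `cp -r src dst` copies into
    dst/src/...; the reported dbfs behavior instead merges src's
    contents directly into dst. Hypothetical helper, not CLI code.
    """
    base = posixpath.basename(src_dir.rstrip("/"))
    if dst_is_dir:
        return posixpath.join(dst.rstrip("/"), base, rel_file)
    return posixpath.join(dst.rstrip("/"), rel_file)
```

So `dest_path("testSrc", "a.txt", "testDst/", True)` yields `testDst/testSrc/a.txt`, matching `cp`, while the current behavior corresponds to the `dst_is_dir=False` branch.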
Requiring `-l` is silly (and, in fact, ignored, as expected) in this use case:

```
databricks workspace import -f DBC -l SCALA /path/to/some.dbc
```

The entire DBC is loaded, without...
`dbfs cp` does not seem to support wildcards. Are there any plans to support wildcards, especially for files on the remote DBFS?
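Until that exists, one workaround is to expand the glob client-side: list the remote directory first, then filter the listing locally before copying each match. A minimal sketch, assuming the listing has already been fetched (e.g. via `dbfs ls`); the function name is hypothetical:

```python
import fnmatch


def expand_remote_glob(pattern: str, remote_paths: list) -> list:
    """Client-side glob expansion over a listing of remote DBFS paths.

    Hypothetical workaround: since the server does not interpret
    wildcards, filter an already-fetched directory listing locally.
    """
    return [p for p in remote_paths if fnmatch.fnmatch(p, pattern)]
```

Each returned path can then be passed to a plain `dbfs cp` call.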
```
$ dbfs cp -r dbfs:/andrew/sample-results.csv .
[11:56:56] The host destination ./sample-results.csv already exists. You should provide the --overwrite flag.
```
Currently, when I export a directory from one workspace to another and want to ignore just one file inside that folder, I have to write commands...
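A hypothetical `--exclude` option could cover this: filter the file list client-side before exporting the remainder. A minimal sketch of that filtering step, assuming the workspace listing has already been retrieved (the function and option name are assumptions, not existing CLI features):

```python
import fnmatch


def filter_exports(paths: list, exclude: list) -> list:
    """Drop paths matching any exclude pattern before export.

    Hypothetical sketch of an --exclude option for workspace export:
    everything not matching an exclude pattern is kept.
    """
    return [
        p for p in paths
        if not any(fnmatch.fnmatch(p, pat) for pat in exclude)
    ]
```

The surviving paths would then be exported one by one, avoiding a hand-written command per file.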