
(Legacy) Command Line Interface for Databricks

Results 140 databricks-cli issues

It seems that when trying to run a notebook job in Azure Databricks with custom parameters, passed in from the Databricks CLI as a JSON string, while using a Windows...

bug
notebooks
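The Windows quoting problem above can often be sidestepped by building the JSON string programmatically and passing it as a single argument without involving the shell. A minimal Python sketch, assuming the legacy `databricks` CLI is on `PATH`; the job ID and widget values are illustrative:

```python
import json
import subprocess

# Illustrative widget values; json.dumps emits correctly quoted JSON,
# so no hand-escaping for cmd.exe, PowerShell, or bash is needed.
params = {"widget1": "widget1value", "widget2": "widget2value"}
notebook_params = json.dumps(params)

# List-form argv (no shell=True) passes the JSON as one argument verbatim,
# avoiding the platform-specific shell quoting rules entirely.
cmd = ["databricks", "jobs", "run-now", "--job-id", "1",
       "--notebook-params", notebook_params]
# subprocess.run(cmd, check=True)  # uncomment to actually invoke the CLI
```

The key point is that `subprocess` with a list of arguments never re-parses quotes, so the same script behaves identically on Windows and Unix.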

We support export in HTML but not import. HTML format doesn't need `--language`, which is a required parameter.

triage
notebooks

I'm trying to run a job from the CLI and I'm able to use a command like this: `databricks jobs run-now --job-id 1 --notebook-params '{"widget1": "widget1value","widget2": "widget2value"}'` This gets to be...

feature
notebooks

I propose adding an option to `workspace` so that the user doesn't need to know which type of object exists at the given path. When executing a "workspace...

feature
UX
notebooks

For example, if my local filesystem looks like
```
- example
  - a.py
  - b.py
```
and remote looks like
```
- example
  - a.py
```
`databricks workspace import_dir ./example...

feature
UX
notebooks
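A sync-style `import_dir`, as requested above, would essentially diff the local and remote listings. A sketch of the set arithmetic involved; the local tree is from the issue's example, while `example/old.py` and a `--delete`-style flag are hypothetical additions for illustration:

```python
# Local tree from the issue's example; "example/old.py" is a hypothetical
# remote-only leftover added to show both sides of the diff.
local = {"example/a.py", "example/b.py"}
remote = {"example/a.py", "example/old.py"}

to_upload = local - remote   # what import_dir already copies up
stale = remote - local       # what a hypothetical sync/--delete mode would remove
```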

Hi there, I'm in the process of using the CLI to create and edit cluster policies, and I noticed that the _create_policy_ and _delete_policy_ methods are referencing a _policy_name_ attribute...

bug
clusters

It seems that `policy_id` is not supported when setting up a new cluster. A raw API call to `api/2.0/clusters/create` works with the below config: `{ "cluster_name": "dev-cluster", "spark_version": "8.4.x-scala2.12", "node_type_id":...

bug
clusters
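Until the CLI accepts the field directly, a common workaround for cluster-spec keys it drops, such as `policy_id`, is to pass the full JSON spec via `databricks clusters create --json-file`, which forwards it to the raw API. A sketch of building such a spec; all values, including the node type and policy ID, are illustrative (the issue's own JSON is truncated):

```python
import json

# Illustrative cluster spec; the raw Clusters 2.0 API accepts "policy_id".
spec = {
    "cluster_name": "dev-cluster",
    "spark_version": "8.4.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",  # assumed value, not from the issue
    "num_workers": 1,                   # illustrative
    "policy_id": "ABC123",              # hypothetical policy ID
}

# Serialized payload, e.g. written to cluster.json and then passed with:
#   databricks clusters create --json-file cluster.json
payload = json.dumps(spec, indent=2)
```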

According to the Databricks 2.0 API documentation, `init_scripts` are supported as an option when creating clusters (https://docs.databricks.com/dev-tools/api/latest/clusters.html). It looks like this kwarg was accidentally removed in https://github.com/databricks/databricks-cli/pull/325 due to some...

bug
clusters

Hi, we have configured our infrastructure with Terraform, and now we want to configure GitLab integration with Databricks to automate notebook and job deployment. I saw that now this step is...

feature
repos

The `get_repo_id` function [doesn't propagate the underlying error message](https://github.com/databricks/databricks-cli/blob/master/databricks_cli/repos/api.py#L46), effectively hiding the reason the Repo ID was not found, which makes it very hard to debug issues, for example when access to workspace...

feature
repos