Databricks CLI
I am currently working on a project in the pharmaceutical sector within a GxP-validated environment. Operating under GxP (Good Practice) standards means we must exercise rigorous control over everything that...
## 🥞 Stacked PR

Use this [link](https://github.com/databricks/cli/pull/3711/files) to review incremental changes.

- [**stack/python-convert-1**](https://github.com/databricks/cli/pull/3711) [[Files changed](https://github.com/databricks/cli/pull/3711/files)]
### Describe the issue

In the REST API request to create a new job (see [here](https://docs.databricks.com/api/workspace/jobs/create)), it is possible to set the `edit_mode` parameter for a job. However, the related...
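For context, a minimal request body for the Jobs `create` endpoint that sets this parameter might look like the sketch below (the job name, task, and notebook path are illustrative placeholders):

```json
{
  "name": "example-job",
  "edit_mode": "UI_LOCKED",
  "tasks": [
    {
      "task_key": "main",
      "notebook_task": {
        "notebook_path": "/Workspace/Users/someone@example.com/notebook"
      }
    }
  ]
}
```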
### Describe the issue

When exporting a directory and a file is encountered that is too large, it should be acceptable to skip that file, but instead the whole command fails.

### Steps to reproduce the behavior...
### Describe the issue

I'm bringing schemas under DABs management by adding schema resources to the bundle config and running `databricks bundle bind...` commands to import the existing schemas. This...
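For reference, a schema resource in a bundle config typically looks like the sketch below (catalog and schema names are placeholders); the pre-existing schema would then be attached to this resource key via `databricks bundle bind`, as described above.

```yaml
# databricks.yml (fragment) -- names here are illustrative placeholders
resources:
  schemas:
    my_schema:
      catalog_name: abc_catalog
      name: abc_schema
      comment: "Managed by DABs"
```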
### Describe the issue

Add `volume_path` to the output of the volume object:

```
{
  "catalog_name": "abc_catalog",
  "schema_name": "abc_schema",
  "name": "abc_volume",
  "volume_path": "/Volumes/abc_catalog/abc_schema/abc_volume"
}
```

Once this is done, DABs...
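The requested field is derivable from the other three fields; a minimal sketch of the mapping, using the field names from the example above:

```python
def volume_path(volume: dict) -> str:
    """Derive the Unity Catalog volume path from its component fields."""
    return "/Volumes/{catalog_name}/{schema_name}/{name}".format(**volume)

vol = {
    "catalog_name": "abc_catalog",
    "schema_name": "abc_schema",
    "name": "abc_volume",
}
vol["volume_path"] = volume_path(vol)
# vol["volume_path"] == "/Volumes/abc_catalog/abc_schema/abc_volume"
```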
I have a Databricks app that works well with hardcoded values in the app.yml file. However, when I specify a variable in this file, it does not get resolved to...
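For context, an app.yml along these lines would reproduce the described behavior, assuming a bundle variable declared elsewhere in the bundle config: a hardcoded value works, while a `${var.<name>}` reference reportedly does not get substituted (the variable name and values here are placeholders):

```yaml
# app.yml (sketch; names and values are illustrative placeholders)
command: ["python", "app.py"]
env:
  - name: API_URL
    value: "https://example.com"   # hardcoded value: works
  - name: API_URL_FROM_VAR
    value: ${var.api_url}          # bundle variable: reportedly not resolved
```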
### Describe the issue

When defining a schema in an asset bundle, the schema gets recreated every time. I poked around a little, and the first thing I found was `force_destroy: true` in...