terraform-provider-databricks
Databricks Terraform Provider
Hi team, greetings. There is a situation where `plan` can succeed but the `apply` command can fail because of a misconfiguration in RBAC, job, or cluster definitions....
We have a fairly standard setup in Azure, where AD-authenticated users in our organisation belong to Azure groups, and we would like to have those users in specific groups automatically...
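A minimal sketch of one way to mirror an Azure AD group into Databricks, assuming the `azuread` provider is configured alongside `databricks`; the group name and membership mapping here are hypothetical, not from the original report:

```hcl
# Look up the Azure AD group and its members (group name is a placeholder)
data "azuread_group" "engineers" {
  display_name = "databricks-engineers"
}

data "azuread_users" "engineers" {
  object_ids = data.azuread_group.engineers.members
}

# Mirror the group and its members into Databricks
resource "databricks_group" "engineers" {
  display_name = data.azuread_group.engineers.display_name
}

resource "databricks_user" "engineers" {
  for_each  = { for u in data.azuread_users.engineers.users : u.user_principal_name => u }
  user_name = each.value.user_principal_name
}

resource "databricks_group_member" "engineers" {
  for_each  = databricks_user.engineers
  group_id  = databricks_group.engineers.id
  member_id = each.value.id
}
```

This is one workaround pattern, not an automatic sync; changes in Azure AD membership are only picked up on the next `terraform apply`.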
### Affected Resource(s) https://registry.terraform.io/providers/databricks/databricks/1.38.0/docs/resources/share#object-configuration-block  **data_object_type (Required) - Type of the data object, currently TABLE, SCHEMA, VOLUME, NOTEBOOK_FILE are supported.** However, when using the NOTEBOOK_FILE data object type, it results in...
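For reference, the documented shape of a share with a NOTEBOOK_FILE object looks roughly like the following sketch (the share name and notebook path are placeholders, not from the report):

```hcl
resource "databricks_share" "this" {
  name = "example_share" # placeholder name

  object {
    # hypothetical workspace path to the notebook being shared
    name             = "/Workspace/Shared/example_notebook"
    data_object_type = "NOTEBOOK_FILE"
  }
}
```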
### Use-cases Currently the provider allows setting owners for the `databricks_catalog` and `databricks_schema` resources, but not for `databricks_sql_table`. Instead, the owner value is set to the application ID of the service principal that was used...
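A sketch of how the requested capability might look, assuming an `owner` argument on `databricks_sql_table` mirroring the one already supported by `databricks_catalog` and `databricks_schema`; this argument is a proposal, not an existing provider feature:

```hcl
resource "databricks_sql_table" "this" {
  name         = "example_table" # placeholder identifiers
  catalog_name = "main"
  schema_name  = "default"
  table_type   = "MANAGED"

  # proposed argument (not currently supported by the provider):
  owner = "data-platform-admins"

  column {
    name = "id"
    type = "bigint"
  }
}
```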
### Configuration

```hcl
resource "databricks_workspace_conf" "this" {
  provider = databricks.workspace
  custom_config = {
    "enableIpAccessLists"  : true,
    "maxTokenLifetimeDays" : 90
  }
}
```

### Expected Behavior It should be possible to destroy...
The documentation for the `databricks_sql_query` resource shows the note "Note: documentation for this resource is a work in progress". Ref: https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/sql_query This note leaves some uncertainty about when the...
### Configuration

```hcl
terraform {
  required_version = ">= 1.0"
  required_providers {
    databricks = {
      source  = "databricks/databricks"
      version = "1.33.0"
    }
  }
}

provider "databricks" {
  host = "https://accounts.azuredatabricks.net"
  account_id...
```
### Use-cases We would like to enable [inference tables](https://docs.databricks.com/en/machine-learning/model-serving/enable-model-serving-inference-tables.html) on our model serving endpoints with Terraform. It doesn't look like this option is available currently? ### Attempted Solutions No other solutions...
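One possible shape for the requested setting, sketched as a hypothetical `auto_capture_config` block on `databricks_model_serving` mirroring the REST API's inference-table configuration; endpoint, model, catalog, and schema names are placeholders:

```hcl
resource "databricks_model_serving" "this" {
  name = "example-endpoint" # placeholder

  config {
    served_models {
      name                  = "example-model"
      model_name            = "main.default.example" # hypothetical registered model
      model_version         = "1"
      workload_size         = "Small"
      scale_to_zero_enabled = true
    }

    # hypothetical block for the requested inference-table support,
    # modeled on the REST API's auto_capture_config
    auto_capture_config {
      catalog_name      = "main"
      schema_name       = "default"
      table_name_prefix = "endpoint_logs"
    }
  }
}
```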
### Use-cases Currently the only way to assign a Unity Catalog Metastore to a Databricks Workspace is to create multiple `databricks_metastore_assignment` resources, each of which will add an individual workspace to a resource....
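A minimal sketch of the multi-resource pattern described above, using `for_each` so one block fans out into one assignment per workspace; the workspace IDs and the `databricks_metastore.this` reference are placeholders:

```hcl
variable "workspace_ids" {
  type    = set(string)
  default = ["1234567890", "2345678901"] # placeholder workspace IDs
}

# one assignment resource is created per workspace ID
resource "databricks_metastore_assignment" "this" {
  for_each     = var.workspace_ids
  workspace_id = each.value
  metastore_id = databricks_metastore.this.id
}
```

This keeps the configuration to a single block, but it still creates N separate assignment resources in state, which is the limitation the use-case is asking to avoid.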
### Expected Behavior Expect the workspace to be created successfully and `terraform apply` to complete without errors when creating a workspace with front-end PrivateLink. ### Actual Behavior Although the workspace...