terraform-provider-databricks
[ISSUE] Issue with `databricks_external_location` resource
Configuration
- Azure Databricks Premium SKU
- A service principal performs the Terraform deployment with the following RBAC roles:
  - On the resource group where the Databricks workspace is deployed: Contributor, User Access Administrator
  - At the subscription level (to cause inheritance into the managed resource group): a custom role with permission for the action `Microsoft.Databricks/accessConnectors/read`
- This service principal exists in both the Databricks workspace and the Databricks account.
- Databricks workspace provider, invoked using the workspace URL (a minimal provider sketch follows this list)
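
For context, a minimal sketch of that provider configuration, assuming the workspace resource address `azurerm_databricks_workspace.general` used later in the issue; the authentication variables are assumptions, not taken from the issue:

```hcl
# Sketch only: workspace-level provider authenticated as the deploying
# service principal. The variable names are hypothetical.
provider "databricks" {
  host                = azurerm_databricks_workspace.general.workspace_url
  azure_client_id     = var.sp_client_id     # hypothetical variable
  azure_client_secret = var.sp_client_secret # hypothetical variable
  azure_tenant_id     = var.tenant_id        # hypothetical variable
}
```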
resource "databricks_external_location" "catalog_metacontainer" {
name = "${local.env}-catalog"
url = format("abfss://%s@%s.dfs.core.windows.net/",
var.datalake_containers.catalog.name,
var.datalake_containers.catalog.storage_account_name
)
credential_name = data.databricks_storage_credential.unity-connector.name # Default unity catalog connector created in workspace
owner = "<admin_GROUP>" # Does not contain the service principal that deploys this resource
}
resource "databricks_grants" "catalog_external_location" {
external_location = databricks_external_location.catalog_metacontainer.id
grant {
principal = local.sp_name
privileges = ["CREATE_EXTERNAL_VOLUME", "CREATE_MANAGED_STORAGE", "BROWSE"]
}
}
data "azurerm_databricks_access_connector" "unity_connector" {
name = "unity-catalog-access-connector"
resource_group_name = azurerm_databricks_workspace.general.managed_resource_group_name
}
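
The configuration above references a `databricks_storage_credential` data source that is not shown in the issue. A minimal sketch of what that block would look like; the credential name is an assumption:

```hcl
# Sketch only: the storage credential data source referenced above.
# The name value is an assumption; the issue does not show this block.
data "databricks_storage_credential" "unity-connector" {
  name = "unity-catalog-access-connector" # hypothetical; match the workspace's default credential
}
```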
Expected Behavior
An external location is created and usable from Terraform state for further deployment (such as creating a catalog at this location; a sketch of that follow-up use is below).
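
As an illustration of the intended follow-up use, a minimal sketch of a catalog created at this location; the catalog resource name and `depends_on` wiring are assumptions:

```hcl
# Sketch only: a catalog whose managed storage lives in the external
# location above. The catalog name is an assumption.
resource "databricks_catalog" "main" {
  name         = "${local.env}_catalog"
  storage_root = databricks_external_location.catalog_metacontainer.url
  depends_on   = [databricks_grants.catalog_external_location]
}
```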
Actual Behavior
The external location is created, but an error prevents further deployment. Subsequently, `terraform plan` cannot refresh the resource's state.
The error indicates that the `CREATE FOREIGN CATALOG` permission is required. That privilege only exists as a grant on connections and should not apply here, because a) no catalog is being created and b) ADLS Gen2 is regular cloud storage, not a foreign connection.
When deleting the external location in the Databricks UI, the following warning is presented even though no catalog, schemas, or tables have been registered to the external location.
Steps to Reproduce
- Import a new Databricks workspace (VNet injection, back-end Private Link) into Terraform state
- Import an ADLS Gen2 storage account (hierarchical namespace enabled, with a container created) into Terraform state
- `terraform apply` the above Terraform script (an import sketch follows this list)
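
A minimal sketch of those imports using declarative `import` blocks (supported by the OpenTofu version listed below); the subscription, resource group, resource names, and the storage account address are all assumptions:

```hcl
# Sketch only: import blocks for the pre-existing resources.
# All IDs and the storage account address are hypothetical.
import {
  to = azurerm_databricks_workspace.general
  id = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Databricks/workspaces/<workspace-name>"
}

import {
  to = azurerm_storage_account.datalake # hypothetical resource address
  id = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account-name>"
}
```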
Terraform and provider versions
- OpenTofu 1.6.2
- databricks/databricks 1.39.0
@SebGay the issue is that the service principal executing Terraform does not have permission to read the external location. The error message is misleading, which hides the root cause.
@nkvuong thank you very much for your help. I will leave it to your discretion whether to close the issue or whether a fix with a more precise error message and a documentation note is warranted.
For others: the issue was specifically that, by setting the owner to a group that did not contain the service principal, the service principal lacked the `READ_FILES` grant required for validation and further use. Instead, I removed the `owner` argument and granted `ALL_PRIVILEGES` on the external location to that group, as sketched below.
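
A minimal sketch of the working configuration described above; the exact HCL of the group grant is an assumption, but it follows the fix as stated:

```hcl
# Sketch only: owner argument removed, so the deploying service principal
# remains the initial owner; the admin group gets ALL_PRIVILEGES instead.
resource "databricks_external_location" "catalog_metacontainer" {
  name = "${local.env}-catalog"
  url = format("abfss://%s@%s.dfs.core.windows.net/",
    var.datalake_containers.catalog.name,
    var.datalake_containers.catalog.storage_account_name
  )
  credential_name = data.databricks_storage_credential.unity-connector.name
  # owner intentionally omitted
}

resource "databricks_grants" "catalog_external_location" {
  external_location = databricks_external_location.catalog_metacontainer.id

  grant {
    principal  = "<admin_GROUP>"
    privileges = ["ALL_PRIVILEGES"]
  }

  grant {
    principal  = local.sp_name
    privileges = ["CREATE_EXTERNAL_VOLUME", "CREATE_MANAGED_STORAGE", "BROWSE"]
  }
}
```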