terraform-provider-databricks
[ISSUE] Unable to import databricks_metastore_assignment
Configuration
resource "databricks_metastore_assignment" "this" {
  metastore_id = databricks_metastore.this.id
  workspace_id = var.workspace_id
}
Expected Behavior
Import should work.
terraform import databricks_metastore_assignment.this <metastore id>/12345678
Actual Behavior
Running terraform apply after the import produces a "force new" change.
-/+ resource "databricks_metastore_assignment" "this" {
      + default_catalog_name = "hive_metastore"
      ~ id                   = "<metastore id>/12345678" -> (known after apply)
      + workspace_id         = 12345678 # forces replacement
        # (1 unchanged attribute hidden)
    }
Terraform and provider versions
Terraform v1.1.4
on darwin_amd64
+ provider registry.terraform.io/databrickslabs/databricks v0.5.7
Important Factoids
The read path doesn't read the workspace ID, so it either needs to be assumed to be correct (and taken from the compound ID), or read through some auxiliary API call (e.g. current workspace ID).
https://github.com/databrickslabs/terraform-provider-databricks/blob/24ed7fd7673f0db26b1fab6e655eed1a49a9d4b8/catalog/resource_metastore_assignment.go#L36-L40
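The first option (trusting the workspace ID embedded in the compound import ID) amounts to splitting the ID on the read path. A minimal sketch in Python with a hypothetical helper name — the provider itself is written in Go, so this only illustrates the idea:

```python
def parse_assignment_id(compound_id: str) -> tuple[str, int]:
    """Split an import ID of the form '<metastore id>/<workspace id>'.

    Hypothetical helper: instead of reading the workspace ID back from an
    API (none returns it), assume the value embedded in the compound ID
    is correct and reuse it to populate workspace_id in state.
    """
    metastore_id, _, workspace = compound_id.partition("/")
    if not metastore_id or not workspace.isdigit():
        raise ValueError(
            f"expected '<metastore id>/<workspace id>', got {compound_id!r}"
        )
    return metastore_id, int(workspace)
```

For example, parse_assignment_id("cff32311-0000-0000-0000-000000000000/12345678") yields the metastore ID and the numeric workspace ID.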
@pietern - there's another option - nil out the importer property for this resource, https://github.com/databrickslabs/terraform-provider-databricks/blob/master/storage/resource_mount.go#L66-L67, as we didn't document it yet - https://registry.terraform.io/providers/databrickslabs/databricks/latest/docs/resources/metastore_assignment :-)
There's no API that gives a workspace ID back. There's an X-Databricks-Org-Id response header, though we need to think about how to pull it from a deeply hidden layer... alternatively, we can ask @adamcain-db to add the workspace ID to the metastore summary response 🤷
Following up - is this issue still relevant?
Thank you for the feature request! Currently, the team operates in a limited capacity, carefully prioritizing, and we cannot provide a timeline to implement this feature. Please make a Pull Request if you'd like to see this feature sooner, and we'll guide you through the journey.
so the workaround at the moment is to manually edit the .tfstate file. After the import, you can see
"attributes": {
  "default_catalog_name": null,
  "id": "blahblah/workspaceid",
  "metastore_id": "blahblah",
  "workspace_id": "null"
}
Setting workspace_id and default_catalog_name will fix this.
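The manual state edit above can also be scripted. A sketch that operates on an already-parsed .tfstate dict (load and save it with json.load / json.dump), using the resource type and attribute names shown in the thread — back up the state file before editing it:

```python
def patch_assignment_state(state: dict, workspace_id: int,
                           default_catalog_name: str = "hive_metastore") -> dict:
    """Fill in the attributes that the import leaves empty.

    Sketch only: walks every databricks_metastore_assignment resource in a
    parsed Terraform state and sets workspace_id and default_catalog_name,
    which is the manual workaround described in this thread.
    """
    for res in state.get("resources", []):
        if res.get("type") != "databricks_metastore_assignment":
            continue
        for inst in res.get("instances", []):
            attrs = inst["attributes"]
            attrs["workspace_id"] = workspace_id
            attrs["default_catalog_name"] = default_catalog_name
    return state
```

Run terraform plan afterwards to confirm the "forces replacement" diff is gone.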
that really helped :) thanks
All resources (except a couple) do support imports generically by just specifying an ID. But we just don't document the cases of "paired IDs", as it turns out that users can damage their state by using those commands improperly 🤷🏻‍♂️
I'm also hitting this issue. I have an assignment in one Terraform project (call it Project A) that I'd like to move to another project (call it Project B).
My IDs take the form <workspace ID>|<metastore ID>, instead of <metastore ID>/<workspace ID>. The examples below show the form, but I've redacted most of the digits to zeroes.
If I run an import in project B with the exact same ID as in project A, I get a "not found":
databricks_metastore_assignment.this["dev"]: Refreshing state... [id=2326000000000000|cff32311-0000-0000-0000-000000000000]
╷
│ Error: Cannot import non-existent remote object
│
│ While attempting to import an existing object to "databricks_metastore_assignment.this[\"dev\"]", the provider detected that no object exists with the
│ given id. Only pre-existing objects can be imported; check that the id is correct and that it is associated with the provider's configured region or
│ endpoint, or use "terraform apply" to create a new remote object for this resource.
╵
If I run an import with the format described in the workarounds above instead, I still get a "not found":
databricks_metastore_assignment.this["dev"]: Refreshing state... [id=cff32311-0000-0000-0000-000000000000/2326000000000000]
╷
│ Error: Cannot import non-existent remote object
│
│ While attempting to import an existing object to "databricks_metastore_assignment.this[\"dev\"]", the provider detected that no object exists with the
│ given id. Only pre-existing objects can be imported; check that the id is correct and that it is associated with the provider's configured region or
│ endpoint, or use "terraform apply" to create a new remote object for this resource.
╵
I then attempted to copy the state file entries from project B to project A. If I run a terraform plan, I get:
Note: Objects have changed outside of Terraform
Terraform detected the following changes made outside of Terraform since the last "terraform apply":
# databricks_metastore_assignment.this["dev"] has been deleted
  - resource "databricks_metastore_assignment" "this" {
      - default_catalog_name = "hive_metastore" -> null
      - id                   = "2326000000000000|cff32311-0000-0000-0000-000000000000" -> null
      - metastore_id         = "cff32311-0000-0000-0000-000000000000" -> null
      - workspace_id         = 2326000000000000 -> null
    }
Unless you have made equivalent changes to your configuration, or ignored the relevant attributes using ignore_changes, the following plan may include
actions to undo or respond to these changes.
─────────────────────────────────────────────────────────────────────────────
Terraform used the selected providers to generate the following execution plan. Resource actions are indicated with the following symbols:
+ create
Terraform will perform the following actions:
# databricks_metastore_assignment.this["dev"] will be created
  + resource "databricks_metastore_assignment" "this" {
      + default_catalog_name = "hive_metastore"
      + id                   = (known after apply)
      + metastore_id         = "cff32311-0000-0000-0000-000000000000"
      + workspace_id         = 2326000000000000
    }
Plan: 1 to add, 0 to change, 0 to destroy.
Any input would be appreciated.
With #2182, importing databricks_metastore_assignment now works, if the provider is defined at the account level.