
[ISSUE] databricks_storage_credential: No API found for PUT even with latest provider version 1.34

Open olibal opened this issue 1 year ago • 6 comments

I have exactly the same issue as described here, so I won't repeat all the details: https://github.com/databricks/terraform-provider-databricks/issues/2697

But I still have this problem with the latest provider version, 1.34.

Configuration

provider "databricks" {
  host          = "https://accounts.azuredatabricks.net"
  account_id    = <my databricks account id>
  auth_type     = "azure-cli"
  alias = "account"
}

resource "databricks_storage_credential" "datastore_credentials" {
  name = local.uc_datastore_name
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.dac_datastore.id
  }
  owner   = var.metastore_admin_group_display_name
  comment = "Managed by TF"
}

Steps to Reproduce

$: terraform plan

Terraform will perform the following actions:

  # module.db_uc_datastore.databricks_storage_credential.datastore_credentials will be created
  + resource "databricks_storage_credential" "datastore_credentials" {
      + comment      = "Managed by TF"
      + id           = (known after apply)
      + metastore_id = (known after apply)
      + name         = "pocdbw-uc-datastore-gerwc"
      + owner        = "uc-metastore-admins"

      + azure_managed_identity {
          + access_connector_id = "/subscriptions/<my azure subscription id>/resourceGroups/pocdbw-rg-databricks-uc-datastore-gerwc/providers/Microsoft.Databricks/accessConnectors/pocdbw-dac-uc-datastore-gerwc"
          + credential_id       = (known after apply)
        }
    }

Plan: 1 to add, 0 to change, 0 to destroy.

$: terraform apply

Terraform and provider versions

$: terraform --version
Terraform v1.6.6
on linux_amd64
+ provider registry.terraform.io/databricks/databricks v1.34.0
+ provider registry.terraform.io/hashicorp/azurerm v3.87.0

Debug Output

2024-01-19T09:20:50.360+0100 [WARN]  Provider "registry.terraform.io/databricks/databricks" produced an invalid plan for module.db_uc_datastore.databricks_storage_credential.datastore_credentials, but we are tolerating it because it is using the legacy plugin SDK.
    The following problems may be the cause of any confusing errors from downstream operations:
      - .databricks_gcp_service_account: attribute representing nested block must not be unknown itself; set nested attribute values to unknown instead
module.db_uc_datastore.databricks_storage_credential.datastore_credentials: Creating...
2024-01-19T09:20:50.360+0100 [INFO]  Starting apply for module.db_uc_datastore.databricks_storage_credential.datastore_credentials
2024-01-19T09:20:50.361+0100 [DEBUG] module.db_uc_datastore.databricks_storage_credential.datastore_credentials: applying the planned Create change
2024-01-19T09:20:50.361+0100 [DEBUG] provider.terraform-provider-databricks_v1.34.0: setting computed for "databricks_gcp_service_account" from ComputedKeys: timestamp="2024-01-19T09:20:50.361+0100"
2024-01-19T09:20:50.805+0100 [DEBUG] provider.terraform-provider-databricks_v1.34.0: non-retriable error: No API found for 'POST /accounts/<my databricks account id>/metastores/storage-credentials': tf_req_id=e473d849-9fd0-69e5-4f9c-fc60160ec707 @module=databricks tf_provider_addr=registry.terraform.io/databricks/databricks tf_resource_type=databricks_storage_credential tf_rpc=ApplyResourceChange @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/logger/logger.go:33 timestamp="2024-01-19T09:20:50.805+0100"
2024-01-19T09:20:50.806+0100 [DEBUG] provider.terraform-provider-databricks_v1.34.0: POST /api/2.0/accounts/<my databricks account id>/metastores//storage-credentials
> {
>   "credential_info": {
>     "azure_managed_identity": {
>       "access_connector_id": "/subscriptions/<my azure subscription id>/resourceGroups/pocdbw-rg-databricks-uc-datas... (88 more bytes)"
>     },
>     "comment": "Managed by TF",
>     "name": "pocdbw-uc-datastore-gerwc"
>   }
> }
< HTTP/2.0 404 Not Found
< {
<   "error_code": "ENDPOINT_NOT_FOUND",
<   "message": "No API found for 'POST /accounts/<my databricks account id>/metastores/storage-credent... (5 more bytes)"
< }: @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/logger/logger.go:33 @module=databricks tf_req_id=e473d849-9fd0-69e5-4f9c-fc60160ec707 tf_rpc=ApplyResourceChange tf_provider_addr=registry.terraform.io/databricks/databricks tf_resource_type=databricks_storage_credential timestamp="2024-01-19T09:20:50.805+0100"
2024-01-19T09:20:50.806+0100 [ERROR] provider.terraform-provider-databricks_v1.34.0: Response contains error diagnostic: diagnostic_detail="" diagnostic_summary="cannot create storage credential: No API found for 'POST /accounts/<my databricks account id>/metastores/storage-credentials'" tf_proto_version=5.4 tf_provider_addr=registry.terraform.io/databricks/databricks tf_req_id=e473d849-9fd0-69e5-4f9c-fc60160ec707 diagnostic_severity=ERROR tf_rpc=ApplyResourceChange @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/vendor/github.com/hashicorp/terraform-plugin-go/tfprotov5/internal/diag/diagnostics.go:62 tf_resource_type=databricks_storage_credential @module=sdk.proto timestamp="2024-01-19T09:20:50.806+0100"
2024-01-19T09:20:50.809+0100 [DEBUG] State storage *remote.State declined to persist a state snapshot
2024-01-19T09:20:50.809+0100 [ERROR] vertex "module.db_uc_datastore.databricks_storage_credential.datastore_credentials" error: cannot create storage credential: No API found for 'POST /accounts/<my databricks account id>/metastores/storage-credentials'
2024-01-19T09:20:50.809+0100 [DEBUG] states/remote: state read serial is: 63; serial is: 63
2024-01-19T09:20:50.809+0100 [DEBUG] states/remote: state read lineage is: 417b87b9-c2ac-f29e-abe8-8d71b8c20b39; lineage is: 417b87b9-c2ac-f29e-abe8-8d71b8c20b39
╷
│ Error: cannot create storage credential: No API found for 'POST /accounts/<my databricks account id>/metastores/storage-credentials'
│ 
│   with module.db_uc_datastore.databricks_storage_credential.datastore_credentials,
│   on ../../childmodules/db-uc-datastore/db-uc-datastore-conf.tf line 5, in resource "databricks_storage_credential" "datastore_credentials":
│    5: resource "databricks_storage_credential" "datastore_credentials" {
│ 
╵

2024-01-19T09:20:50.904+0100 [DEBUG] provider.stdio: received EOF, stopping recv loop: err="rpc error: code = Unavailable desc = error reading from server: EOF"
2024-01-19T09:20:50.908+0100 [DEBUG] provider: plugin process exited: path=.terraform/providers/registry.terraform.io/databricks/databricks/1.34.0/linux_amd64/terraform-provider-databricks_v1.34.0 pid=63834
2024-01-19T09:20:50.908+0100 [DEBUG] provider: plugin exited

Important Factoids

- I'm using Azure Databricks
- I'm deploying the resource using an account-level provider
- The Service Principal deploying the Storage Credential is an Account & Metastore Admin
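
For reference, the failing request in the debug output above hits a path with an empty metastore segment (metastores//storage-credentials). The account-level endpoint the provider is calling has the shape sketched below, written in the same request format as the logs, with placeholder values:

POST /api/2.0/accounts/<my databricks account id>/metastores/<metastore id>/storage-credentials
> {
>   "credential_info": {
>     "name": "<storage credential name>",
>     "azure_managed_identity": {
>       "access_connector_id": "<access connector resource id>"
>     }
>   }
> }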

olibal (Jan 19 '24 08:01)

If you want to use the account-level provider, then you need

provider = databricks.account

in the resource block; without it, the resource uses the databricks provider attached to the workspace.
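
A minimal sketch of that, reusing the resource from the original report (names and references unchanged, only the provider reference added):

resource "databricks_storage_credential" "datastore_credentials" {
  provider = databricks.account

  name = local.uc_datastore_name
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.dac_datastore.id
  }
  owner   = var.metastore_admin_group_display_name
  comment = "Managed by TF"
}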

alexott (Jan 19 '24 08:01)

In the issue I had shortened the information.

The configuration is actually executed by a child module which has only one databricks provider.

The root module, which contains the provider configurations, passes only the databricks.account provider to the child module, so no

provider = databricks.account

should be required in the resource block of the child module.

root module

provider "databricks" {
  host          = "https://accounts.azuredatabricks.net"
  account_id    = "<my databricks account id>"
  auth_type     = "azure-cli"
  alias         = "account"
}


module "db_uc_datastore" {
  source = "./../../childmodules/db-uc-datastore"
  depends_on = [ module.dbw_poc_sandbox ]

  providers = {
    azurerm     = azurerm
    databricks  = databricks.account
  }

}

child module

terraform {
  required_providers {
    azurerm = {
      source  = "registry.terraform.io/hashicorp/azurerm"
      version = "~> 3.87"
    }

    databricks = {
      source  = "databricks/databricks"
      version = "~>1.34"
    }

  }
  required_version = ">= 1.4.6"
}

resource "databricks_storage_credential" "datastore_credentials" {
  name = local.uc_datastore_name
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.dac_datastore.id
  }
  owner   = var.metastore_admin_group_display_name
  comment = "Managed by TF"
}
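
An alternative sketch, not from the original configuration (azurerm entry omitted for brevity): the child module can declare the aliased provider explicitly via configuration_aliases and reference it per resource, which makes the account-level requirement visible inside the child module itself:

terraform {
  required_providers {
    databricks = {
      source                = "databricks/databricks"
      version               = "~>1.34"
      configuration_aliases = [databricks.account]
    }
  }
}

resource "databricks_storage_credential" "datastore_credentials" {
  provider = databricks.account

  name = local.uc_datastore_name
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.dac_datastore.id
  }
  owner   = var.metastore_admin_group_display_name
  comment = "Managed by TF"
}

With that, the root module would pass providers = { databricks.account = databricks.account } instead of mapping the alias onto the child module's default databricks provider.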

olibal (Jan 19 '24 09:01)

Same issue with AWS, provider version 1.34.0. Minimal repro:

variable "unity_catalogs" {
  type    = set(string)
  default = ["staging"]
}

# Account level catalog owners
resource "databricks_service_principal" "catalog" {
  for_each     = var.unity_catalogs
  display_name = "unity_catalog_${each.value}_sp"
  provider     = databricks.account
}

resource "databricks_storage_credential" "catalog_staging" {
  name  = "catalog_staging"
  owner = databricks_service_principal.catalog["staging"].application_id
  aws_iam_role {
    role_arn = "arn:aws:iam::1234567890:role/MyRole-AJJHDSKSDF"
  }
  provider = databricks.account
}

Logs:

2024-01-19T09:31:48.914Z [DEBUG] provider.terraform-provider-databricks_v1.34.0: POST /api/2.0/accounts/b0834ee0-0ead-4165-9d0a-6f081fdae9bc/metastores//storage-credentials
> {
>   "credential_info": {
>     "aws_iam_role": {
>       "role_arn": "arn:aws:iam::1234567890:role/MyRole-AJJHDSKSDF"
>     },
>     "name": "catalog_staging"
>   }
> }
< HTTP/2.0 404 Not Found
< {
<   "error_code": "ENDPOINT_NOT_FOUND",
<   "message": "No API found for 'POST /accounts/b0834ee0-0ead-4165-9d0a-6f081fdae9bc/metastores/storage-credent... (5 more bytes)"
< }: @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/logger/logger.go:33 @module=databricks tf_provider_addr=registry.terraform.io/databricks/databricks tf_req_id=86fba70a-4dff-e9bc-fd0e-658661bce807 tf_resource_type=databricks_storage_credential tf_rpc=ApplyResourceChange timestamp=2024-01-19T09:31:48.912Z
2024-01-19T09:31:48.914Z [ERROR] provider.terraform-provider-databricks_v1.34.0: Response contains error diagnostic: tf_proto_version=5.4 tf_provider_addr=registry.terraform.io/databricks/databricks tf_req_id=86fba70a-4dff-e9bc-fd0e-658661bce807 tf_rpc=ApplyResourceChange diagnostic_summary="cannot create storage credential: No API found for 'POST /accounts/b0834ee0-0ead-4165-9d0a-6f081fdae9bc/metastores/storage-credentials'" diagnostic_detail= diagnostic_severity=ERROR tf_resource_type=databricks_storage_credential @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/vendor/github.com/hashicorp/terraform-plugin-go/tfprotov5/internal/diag/diagnostics.go:62 @module=sdk.proto timestamp=2024-01-19T09:31:48.912Z
╷
│ Error: cannot create storage credential: No API found for 'POST /accounts/b0834ee0-0ead-4165-9d0a-6f081fdae9bc/metastores/storage-credentials'
│ 
│   with databricks_storage_credential.catalog_staging,
│   on catalogs.tf line 13, in resource "databricks_storage_credential" "catalog_staging":
│   13: resource "databricks_storage_credential" "catalog_staging" {
│ 
╵

where b0834ee0-0ead-4165-9d0a-6f081fdae9bc is our Databricks account ID (value is replaced for security purposes).

donatasm (Jan 19 '24 09:01)

Setting metastore_id fixes the ENDPOINT_NOT_FOUND error:

resource "databricks_storage_credential" "catalog_staging" {
  name  = "catalog_staging"
  owner = databricks_service_principal.catalog["staging"].application_id
  metastore_id = databricks_metastore.unity_metastore.id
  aws_iam_role {
    role_arn = "arn:aws:iam::1234567890:role/MyRole-AJJHDSKSDF"
  }
  provider = databricks.account
}

However, setting metastore_id leads to a different error:

Error: cannot create storage credential: Storage Credential '02e010b4-0fdb-46d7-af9f-706e99f2b19b' does not exist.

Logs:

2024-01-19T13:34:51.868Z [DEBUG] provider.terraform-provider-databricks_v1.34.0: POST /api/2.0/accounts/b0834ee0-0ead-4165-9d0a-6f081fdae9bc/metastores/79ba06d5-3c04-4fc1-9df2-a82ac2ec27e4/storage-credentials
> {
>   "credential_info": {
>     "aws_iam_role": {
>       "role_arn": "arn:aws:iam::1234567890:role/MyRole-AJJHDSKSDF"
>     },
>     "name": "catalog_staging"
>   }
> }
< HTTP/2.0 200 OK
< {
<   "credential_info": {
<     "aws_iam_role": {
<       "external_id": "b0834ee0-0ead-4165-9d0a-6f081fdae9bc",
<       "role_arn": "arn:aws:iam::1234567890:role/MyRole-AJJHDSKSDF",
<       "unity_catalog_iam_arn": "arn:aws:iam::414351767826:role/unity-catalog-prod-UCMasterRole-14S5ZJVKOTYTL"
<     },
<     "created_at": 1705671291796,
<     "created_by": "[email protected]",
<     "full_name": "catalog_staging",
<     "id": "02e010b4-0fdb-46d7-af9f-706e99f2b19b",
<     "isolation_mode": "ISOLATION_MODE_OPEN",
<     "metastore_id": "79ba06d5-3c04-4fc1-9df2-a82ac2ec27e4",
<     "name": "catalog_staging",
<     "owner": "[email protected]",
<     "read_only": false,
<     "securable_kind": "STORAGE_CREDENTIAL_AWS_IAM",
<     "securable_type": "STORAGE_CREDENTIAL",
<     "updated_at": 1705671291796,
<     "updated_by": "[email protected]"
<   }
< }: tf_provider_addr=registry.terraform.io/databricks/databricks tf_rpc=ApplyResourceChange @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/logger/logger.go:33 @module=databricks tf_req_id=3be0adc2-8aa6-6a10-3bf7-5f2b15ecde33 tf_resource_type=databricks_storage_credential timestamp=2024-01-19T13:34:51.868Z
2024-01-19T13:34:52.457Z [DEBUG] provider.terraform-provider-databricks_v1.34.0: non-retriable error: Storage Credential '02e010b4-0fdb-46d7-af9f-706e99f2b19b' does not exist.: tf_provider_addr=registry.terraform.io/databricks/databricks tf_resource_type=databricks_storage_credential tf_rpc=ApplyResourceChange @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/logger/logger.go:33 @module=databricks tf_req_id=3be0adc2-8aa6-6a10-3bf7-5f2b15ecde33 timestamp=2024-01-19T13:34:52.457Z
2024-01-19T13:34:52.457Z [DEBUG] provider.terraform-provider-databricks_v1.34.0: PUT /api/2.0/accounts/b0834ee0-0ead-4165-9d0a-6f081fdae9bc/metastores/79ba06d5-3c04-4fc1-9df2-a82ac2ec27e4/storage-credentials/02e010b4-0fdb-46d7-af9f-706e99f2b19b
> {
>   "credential_info": {
>     "aws_iam_role": {
>       "role_arn": "arn:aws:iam::1234567890:role/MyRole-AJJHDSKSDF"
>     },
>     "owner": "0c6b1aa7-c609-4605-9195-710ec90cb478"
>   }
> }
< HTTP/2.0 404 Not Found
< {
<   "details": [
<     {
<       "@type": "type.googleapis.com/google.rpc.RequestInfo",
<       "request_id": "f31c49ab-ab2f-4d13-9275-16df29b08747",
<       "serving_data": ""
<     }
<   ],
<   "error_code": "STORAGE_CREDENTIAL_DOES_NOT_EXIST",
<   "message": "Storage Credential '02e010b4-0fdb-46d7-af9f-706e99f2b19b' does not exist."

For some reason, the PUT call fails for the storage credential ID 02e010b4-0fdb-46d7-af9f-706e99f2b19b that was just returned by the successful POST call.

Also, I think the external_id value returned in the POST response should be set to the owner service principal ID instead of the account ID.

donatasm (Jan 19 '24 13:01)

After adding metastore_id and managed_identity_id, I see the same error as described by donatasm.

The provider fails with "Error: cannot create storage credential: Storage Credential 'bb639fd8-ab52-446e-88d5-ab0169a42473' does not exist." but I can see the credential in the Databricks UI:

(Screenshot from 2024-01-19 15:41:04 showing the storage credential in the Databricks UI)

resource "databricks_storage_credential" "datastore_credentials" {
  name          = local.uc_datastore_name
  metastore_id  = var.db_metastore_id
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.dac_datastore.id
    managed_identity_id = azurerm_user_assigned_identity.uai_dac.id
  }
  owner   = var.metastore_admin_group_display_name
  comment = "Managed by TF"
}

DEBUG

2024-01-19T15:32:41.085+0100 [WARN]  Provider "registry.terraform.io/databricks/databricks" produced an invalid plan for module.db_uc_datastore.databricks_storage_credential.datastore_credentials, but we are tolerating it because it is using the legacy plugin SDK.
    The following problems may be the cause of any confusing errors from downstream operations:
      - .databricks_gcp_service_account: attribute representing nested block must not be unknown itself; set nested attribute values to unknown instead
module.db_uc_datastore.databricks_storage_credential.datastore_credentials: Creating...
2024-01-19T15:32:41.085+0100 [INFO]  Starting apply for module.db_uc_datastore.databricks_storage_credential.datastore_credentials
2024-01-19T15:32:41.085+0100 [DEBUG] module.db_uc_datastore.databricks_storage_credential.datastore_credentials: applying the planned Create change
2024-01-19T15:32:41.086+0100 [DEBUG] provider.terraform-provider-databricks_v1.34.0: setting computed for "databricks_gcp_service_account" from ComputedKeys: timestamp="2024-01-19T15:32:41.086+0100"
2024-01-19T15:32:42.037+0100 [DEBUG] provider.terraform-provider-databricks_v1.34.0: POST /api/2.0/accounts/<my databricks account>/metastores/8e482d7b-a035-4d52-b8b1-66610d825ad5/storage-credentials
> {
>   "credential_info": {
>     "azure_managed_identity": {
>       "access_connector_id": "/subscriptions/<my azure subscription>/resourceGroups/pocdbw-rg-databricks-uc-datas... (88 more bytes)",
>       "managed_identity_id": "/subscriptions/<my azure subscription>/resourceGroups/pocdbw-rg-databricks-uc-datas... (108 more bytes)"
>     },
>     "comment": "Managed by TF",
>     "name": "pocdbw-uc-datastore-gerwc"
>   }
> }
< HTTP/2.0 200 OK
< {
<   "credential_info": {
<     "azure_managed_identity": {
<       "access_connector_id": "/subscriptions/<my azure subscription>/resourceGroups/pocdbw-rg-databricks-uc-datas... (88 more bytes)",
<       "credential_id": "1085e0d5-3773-475e-b6e9-e5869cec03b8",
<       "managed_identity_id": "/subscriptions/<my azure subscription>/resourceGroups/pocdbw-rg-databricks-uc-datas... (108 more bytes)"
<     },
<     "comment": "Managed by TF",
<     "created_at": 1705674761742,
<     "created_by": "[email protected]",
<     "full_name": "pocdbw-uc-datastore-gerwc",
<     "id": "bb639fd8-ab52-446e-88d5-ab0169a42473",
<     "isolation_mode": "ISOLATION_MODE_OPEN",
<     "metastore_id": "8e482d7b-a035-4d52-b8b1-66610d825ad5",
<     "name": "pocdbw-uc-datastore-gerwc",
<     "owner": "[email protected]",
<     "read_only": false,
<     "securable_kind": "STORAGE_CREDENTIAL_AZURE_MI",
<     "securable_type": "STORAGE_CREDENTIAL",
<     "updated_at": 1705674761742,
<     "updated_by": "[email protected]"
<   }
< }: tf_req_id=c68605b5-eaa1-dfb2-cd2e-b793f6248a01 tf_resource_type=databricks_storage_credential tf_rpc=ApplyResourceChange @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/logger/logger.go:33 @module=databricks tf_provider_addr=registry.terraform.io/databricks/databricks timestamp="2024-01-19T15:32:42.037+0100"
2024-01-19T15:32:42.947+0100 [DEBUG] provider.terraform-provider-databricks_v1.34.0: non-retriable error: Storage Credential 'bb639fd8-ab52-446e-88d5-ab0169a42473' does not exist.: @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/logger/logger.go:33 @module=databricks tf_provider_addr=registry.terraform.io/databricks/databricks tf_resource_type=databricks_storage_credential tf_rpc=ApplyResourceChange tf_req_id=c68605b5-eaa1-dfb2-cd2e-b793f6248a01 timestamp="2024-01-19T15:32:42.946+0100"
2024-01-19T15:32:42.947+0100 [DEBUG] provider.terraform-provider-databricks_v1.34.0: PUT /api/2.0/accounts/<my databricks account>/metastores/8e482d7b-a035-4d52-b8b1-66610d825ad5/storage-credentials/bb639fd8-ab52-446e-88d5-ab0169a42473
> {
>   "credential_info": {
>     "azure_managed_identity": {
>       "access_connector_id": "/subscriptions/<my azure subscription>/resourceGroups/pocdbw-rg-databricks-uc-datas... (88 more bytes)",
>       "managed_identity_id": "/subscriptions/<my azure subscription>/resourceGroups/pocdbw-rg-databricks-uc-datas... (108 more bytes)"
>     },
>     "comment": "Managed by TF",
>     "owner": "uc-metastore-admins"
>   }
> }
< HTTP/2.0 404 Not Found
< {
<   "details": [
<     {
<       "@type": "type.googleapis.com/google.rpc.RequestInfo",
<       "request_id": "47b2e1b3-928d-441d-a866-c1f97b55906e",
<       "serving_data": ""
<     }
<   ],
<   "error_code": "STORAGE_CREDENTIAL_DOES_NOT_EXIST",
<   "message": "Storage Credential 'bb639fd8-ab52-446e-88d5-ab0169a42473' does not exist."
< }: tf_rpc=ApplyResourceChange @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/logger/logger.go:33 @module=databricks tf_provider_addr=registry.terraform.io/databricks/databricks tf_req_id=c68605b5-eaa1-dfb2-cd2e-b793f6248a01 tf_resource_type=databricks_storage_credential timestamp="2024-01-19T15:32:42.947+0100"
2024-01-19T15:32:42.948+0100 [ERROR] provider.terraform-provider-databricks_v1.34.0: Response contains error diagnostic: tf_proto_version=5.4 tf_req_id=c68605b5-eaa1-dfb2-cd2e-b793f6248a01 tf_resource_type=databricks_storage_credential @module=sdk.proto tf_provider_addr=registry.terraform.io/databricks/databricks tf_rpc=ApplyResourceChange diagnostic_summary="cannot create storage credential: Storage Credential 'bb639fd8-ab52-446e-88d5-ab0169a42473' does not exist." diagnostic_severity=ERROR @caller=/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/vendor/github.com/hashicorp/terraform-plugin-go/tfprotov5/internal/diag/diagnostics.go:62 diagnostic_detail="" timestamp="2024-01-19T15:32:42.948+0100"
2024-01-19T15:32:42.952+0100 [DEBUG] State storage *remote.State declined to persist a state snapshot
2024-01-19T15:32:42.952+0100 [ERROR] vertex "module.db_uc_datastore.databricks_storage_credential.datastore_credentials" error: cannot create storage credential: Storage Credential 'bb639fd8-ab52-446e-88d5-ab0169a42473' does not exist.
2024-01-19T15:32:42.953+0100 [DEBUG] states/remote: state read serial is: 65; serial is: 65
2024-01-19T15:32:42.953+0100 [DEBUG] states/remote: state read lineage is: 417b87b9-c2ac-f29e-abe8-8d71b8c20b39; lineage is: 417b87b9-c2ac-f29e-abe8-8d71b8c20b39
╷
│ Error: cannot create storage credential: Storage Credential 'bb639fd8-ab52-446e-88d5-ab0169a42473' does not exist.
│ 
│   with module.db_uc_datastore.databricks_storage_credential.datastore_credentials,
│   on ../../childmodules/db-uc-datastore/db-uc-datastore-conf.tf line 6, in resource "databricks_storage_credential" "datastore_credentials":
│    6: resource "databricks_storage_credential" "datastore_credentials" {
│ 
╵

olibal (Jan 19 '24 14:01)

https://github.com/databricks/terraform-provider-databricks/pull/3184 fixes the issue
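
Once that change is in a release, bumping the provider constraint should pick it up. A minimal sketch, assuming the fix lands in the first release after 1.34.0 (the exact version should be checked against the release notes):

terraform {
  required_providers {
    databricks = {
      source  = "databricks/databricks"
      # assumed to be the first release containing the fix from PR #3184; verify against the changelog
      version = ">= 1.35.0"
    }
  }
}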

donatasm (Jan 30 '24 12:01)