Planned import errors with "Cannot import to non-existent resource address", but plan without import plans to create the resource
Terraform Version
1.5.6
Terraform Configuration Files
module "clarity_pyroscope_server_project" {
  source    = "./modules/project"
  namespace = module.clarity_namespace
  name      = "pyroscope-server"
  # ...
}

import {
  to = module.clarity_pyroscope_server_project.gitlab_project_share_group.developers["sec-te-friends"]
  id = "2148:2527"
}

import {
  to = module.clarity_pyroscope_server_project.gitlab_project_share_group.maintainers["sec-hs-te"]
  id = "2148:2524"
}

import {
  to = module.clarity_pyroscope_server_project.gitlab_project.main
  id = "2148"
}

import {
  to = module.clarity_pyroscope_server_project.gitlab_branch_protection.main
  id = "2148:develop"
}

import {
  to = module.clarity_pyroscope_server_project.gitlab_tag_protection.main
  id = "2148:*"
}

import {
  to = module.clarity_pyroscope_server_project.gitlab_project_variable.docker_auth_config[0]
  id = "2148:DOCKER_AUTH_CONFIG:*"
}

import {
  to = module.clarity_pyroscope_server_project.gitlab_project_variable.artifactory_user["playground"]
  id = "2148:PLAYGROUND_USER:*"
}

import {
  to = module.clarity_pyroscope_server_project.gitlab_project_variable.artifactory_token["playground"]
  id = "2148:PLAYGROUND_TOKEN:*"
}

import {
  to = module.clarity_pyroscope_server_project.gitlab_project_variable.artifactory_user["release"]
  id = "2148:RELEASE_USER:*"
}

import {
  to = module.clarity_pyroscope_server_project.gitlab_project_variable.artifactory_token["release"]
  id = "2148:RELEASE_TOKEN:*"
}

import {
  to = module.clarity_pyroscope_server_project.gitlab_project_variable.artifactory_user["snapshot"]
  id = "2148:SNAPSHOT_USER:*"
}

import {
  to = module.clarity_pyroscope_server_project.gitlab_project_variable.artifactory_token["snapshot"]
  id = "2148:SNAPSHOT_TOKEN:*"
}
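For context, a hypothetical sketch of what the module internals behind one of these addresses might look like — the names and attributes are inferred from the import addresses, since the actual `./modules/project` source is not shown in the issue:

```hcl
# Hypothetical sketch of ./modules/project internals, inferred from the
# import addresses above — not the actual module source.
variable "share_developers" {
  type    = map(number) # group name => GitLab group ID
  default = {}
}

# The for_each keys here ("sec-te-friends", etc.) are what the import
# blocks reference as instance keys.
resource "gitlab_project_share_group" "developers" {
  for_each     = var.share_developers
  project      = gitlab_project.main.id
  group_id     = each.value
  group_access = "developer"
}
```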
Debug Output
Happy to provide a redacted version upon request, but I couldn't find anything relevant for the imports.
Expected Behavior
The terraform plan/apply succeeds, the resources specified in the import blocks are imported successfully.
Actual Behavior
I get errors about non-existent resource addresses, but only for resources that use for_each/count. Example:
╷
│ Error: Cannot import to non-existent resource address
│
│ Importing to resource address
│ module.clarity_pyroscope_server_project.gitlab_project_share_group.developers["sec-te-friends"]
│ is not possible, because that address does not exist in configuration.
│ Please ensure that the resource key is correct, or remove this import
│ block.
╵
However, a terraform plan without the import blocks shows me that all these resources are planned to be created.
(Collapsed details: "To be imported" / "Failed to import" / "Created by plan without imports" — address lists elided.)
In reality, many more modules and import blocks are automatically generated (~200 modules, ~7000 imports). So far I've randomly picked several of the addresses that the import blocks complain about, and they are all in the list of resources that would be created by a plan without imports.
Steps to Reproduce
I have not yet attempted to find a minimally reproducible example.
Additional Context
No response
References
No response
I diffed the full list of addresses that the Terraform plan was going to create vs. the list of resources I have import blocks for, and there is no resource address in the imports that wouldn't also get created by a naked plan.
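For anyone wanting to reproduce that diff, here is a minimal sketch. The plan JSON below is a hand-made stand-in for the output of `terraform show -json tfplan`, and `import_addresses` would normally be scraped from the import blocks:

```python
import json

# Stand-in for `terraform show -json tfplan` output (hypothetical data).
plan_json = """
{"resource_changes": [
  {"address": "random_uuid.test1[0]", "change": {"actions": ["create"]}},
  {"address": "random_uuid.test2",    "change": {"actions": ["no-op"]}}
]}
"""

# Addresses referenced by the import blocks (hypothetical data).
import_addresses = {"random_uuid.test1[0]", "random_uuid.test2"}

plan = json.loads(plan_json)

# Collect every address the plan would create.
to_create = {
    rc["address"]
    for rc in plan["resource_changes"]
    if "create" in rc["change"]["actions"]
}

# Import targets that the plan would NOT create — candidates for a
# genuine "non-existent resource address" error.
missing = import_addresses - to_create
print(sorted(missing))  # → ['random_uuid.test2']
```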
Hi @NiklasRosenstein,
Thanks for filing the issue! Given where this validation occurs, I don't yet see how we could arrive there with the given resource being planned during a normal plan but missing when there's an import block. Just to confirm, I also ran some stress and race tests with thousands of resources and modules and still came up with no errors.
Is it possible that the general configuration was generated differently when the imports are not present? Are you running these from the same working directory and just removing a file with import blocks, or are the working directories different?
If you are able to narrow this down to a more minimal example and provide the logs it would help greatly.
Thanks!
@jbardin I am also experiencing the same error. Below is a simplified simulation example.
# main.tf

resource "random_uuid" "test1" {
  count = 2
}

resource "random_uuid" "test2" {
}

import {
  to = random_uuid.test1[0]
  id = "D9B77F59-1C26-451F-8A9C-DF291925878F"
}

import {
  to = random_uuid.test1[1]
  id = "F6C3BB71-548F-4BDC-976C-D7B8F23806FC"
}

import {
  to = random_uuid.test2
  id = "4CF1985A-4EE7-4B4A-AA6F-867282C73AE9"
}
With the above file, when I run terraform plan -target=random_uuid.test2, it errors out with "Cannot import to non-existent resource address":
$ terraform plan -target=random_uuid.test2
random_uuid.test2: Preparing import... [id=4CF1985A-4EE7-4B4A-AA6F-867282C73AE9]
random_uuid.test2: Refreshing state... [id=4cf1985a-4ee7-4b4a-aa6f-867282c73ae9]
╷
│ Warning: Resource targeting is in effect
│
│ You are creating a plan with the -target option, which means that the result of this plan may not represent all of the changes requested by the current configuration.
│
│ The -target option is not for routine use, and is provided only for exceptional situations such as recovering from errors or mistakes, or when Terraform specifically suggests to use it as part
│ of an error message.
╵
╷
│ Error: Cannot import to non-existent resource address
│
│ Importing to resource address random_uuid.test1[0] is not possible, because that address does not exist in configuration. Please ensure that the resource key is correct, or remove this import
│ block.
╵
╷
│ Error: Cannot import to non-existent resource address
│
│ Importing to resource address random_uuid.test1[1] is not possible, because that address does not exist in configuration. Please ensure that the resource key is correct, or remove this import
│ block.
When I run terraform plan -target=random_uuid.test1, there are no errors:
$ terraform plan -target=random_uuid.test1
random_uuid.test1[1]: Preparing import... [id=F6C3BB71-548F-4BDC-976C-D7B8F23806FC]
random_uuid.test1[0]: Preparing import... [id=D9B77F59-1C26-451F-8A9C-DF291925878F]
random_uuid.test1[0]: Refreshing state... [id=d9b77f59-1c26-451f-8a9c-df291925878f]
random_uuid.test1[1]: Refreshing state... [id=f6c3bb71-548f-4bdc-976c-d7b8f23806fc]
Terraform used the selected providers to generate the following execution plan. Resource actions are indicated with the following symbols:
~ update in-place
Terraform will perform the following actions:
  # random_uuid.test1[0] will be updated in-place
  # (imported from "D9B77F59-1C26-451F-8A9C-DF291925878F")
  ~ resource "random_uuid" "test1" {
        id      = "d9b77f59-1c26-451f-8a9c-df291925878f"
      - keepers = {} -> null
        result  = "d9b77f59-1c26-451f-8a9c-df291925878f"
    }

  # random_uuid.test1[1] will be updated in-place
  # (imported from "F6C3BB71-548F-4BDC-976C-D7B8F23806FC")
  ~ resource "random_uuid" "test1" {
        id      = "f6c3bb71-548f-4bdc-976c-d7b8f23806fc"
      - keepers = {} -> null
        result  = "f6c3bb71-548f-4bdc-976c-d7b8f23806fc"
    }
Plan: 2 to import, 0 to add, 2 to change, 0 to destroy.
╷
│ Warning: Resource targeting is in effect
│
│ You are creating a plan with the -target option, which means that the result of this plan may not represent all of the changes requested by the current configuration.
│
│ The -target option is not for routine use, and is provided only for exceptional situations such as recovering from errors or mistakes, or when Terraform specifically suggests to use it as part
│ of an error message.
╵
───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
Note: You didn't use the -out option to save this plan, so Terraform can't guarantee to take exactly these actions if you run "terraform apply" now.
When I run a plain terraform plan, there are also no errors:
$ terraform plan
random_uuid.test1[0]: Preparing import... [id=D9B77F59-1C26-451F-8A9C-DF291925878F]
random_uuid.test1[1]: Preparing import... [id=F6C3BB71-548F-4BDC-976C-D7B8F23806FC]
random_uuid.test2: Preparing import... [id=4CF1985A-4EE7-4B4A-AA6F-867282C73AE9]
random_uuid.test1[0]: Refreshing state... [id=d9b77f59-1c26-451f-8a9c-df291925878f]
random_uuid.test1[1]: Refreshing state... [id=f6c3bb71-548f-4bdc-976c-d7b8f23806fc]
random_uuid.test2: Refreshing state... [id=4cf1985a-4ee7-4b4a-aa6f-867282c73ae9]
Terraform used the selected providers to generate the following execution plan. Resource actions are indicated with the following symbols:
~ update in-place
Terraform will perform the following actions:
  # random_uuid.test1[0] will be updated in-place
  # (imported from "D9B77F59-1C26-451F-8A9C-DF291925878F")
  ~ resource "random_uuid" "test1" {
        id      = "d9b77f59-1c26-451f-8a9c-df291925878f"
      - keepers = {} -> null
        result  = "d9b77f59-1c26-451f-8a9c-df291925878f"
    }

  # random_uuid.test1[1] will be updated in-place
  # (imported from "F6C3BB71-548F-4BDC-976C-D7B8F23806FC")
  ~ resource "random_uuid" "test1" {
        id      = "f6c3bb71-548f-4bdc-976c-d7b8f23806fc"
      - keepers = {} -> null
        result  = "f6c3bb71-548f-4bdc-976c-d7b8f23806fc"
    }

  # random_uuid.test2 will be updated in-place
  # (imported from "4CF1985A-4EE7-4B4A-AA6F-867282C73AE9")
  ~ resource "random_uuid" "test2" {
        id      = "4cf1985a-4ee7-4b4a-aa6f-867282c73ae9"
      - keepers = {} -> null
        result  = "4cf1985a-4ee7-4b4a-aa6f-867282c73ae9"
    }
Plan: 3 to import, 0 to add, 3 to change, 0 to destroy.
───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
Note: You didn't use the -out option to save this plan, so Terraform can't guarantee to take exactly these actions if you run "terraform apply" now.
@apparentlymart The reproduction steps I posted above were run on Terraform v1.6.4.
I am also facing this issue. Is there a workaround?
There is a significant likelihood that this issue was fixed in 1.7 with the improvements/fixes to the for_each functionality. If anyone is seeing this issue in 1.7, please let us know and I will re-open this issue.
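As a side note, Terraform 1.7 also added for_each support to import blocks, which pairs naturally with count/for_each resources like the ones in the reproduction above. A sketch (untested against this exact configuration):

```hcl
# Sketch for Terraform >= 1.7: an import block can iterate with for_each,
# replacing one hand-written block per instance.
locals {
  test1_uuids = {
    0 = "D9B77F59-1C26-451F-8A9C-DF291925878F"
    1 = "F6C3BB71-548F-4BDC-976C-D7B8F23806FC"
  }
}

import {
  for_each = local.test1_uuids
  to       = random_uuid.test1[tonumber(each.key)]
  id       = each.value
}
```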
I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.