terraform-cdk
Multistack cross-stack reference error: bug?
Expected Behavior
I expect stack A to be independent of stack B since no dependencies are defined.
Actual Behavior
When using multiple stacks, I get a cross-stack reference error: the top stack is apparently dependent on the second stack, even though it should not be.
Update:
Both stacks set up their own ExternalProvider, which used external data. This is what caused the issue. I don't know if it's expected behaviour, though. See comment.
const app = new App();
new cargostream(app, "app-dev", devConfig);
new apiStack(app, "app-api-dev", devApiConfig);
app.synth();
[2024-02-24T02:06:22.792] [ERROR] default - ╷
│ Error: Unable to find remote state
│
│ with data.terraform_remote_state.cross-stack-reference-input-app-api-dev,
│ on cdk.tf.json line 64, in data.terraform_remote_state.cross-stack-reference-input-app-api-dev:
│ 64: "workspace": "${terraform.workspace}"
│
│ No stored state was found for the given workspace in the given backend.
app-dev ╷
│ Error: Unable to find remote state
│
│ with data.terraform_remote_state.cross-stack-reference-input-app-api-dev,
│ on cdk.tf.json line 64, in data.terraform_remote_state.cross-stack-reference-input-app-api-dev:
│ 64: "workspace": "${terraform.workspace}"
│
│ No stored state was found for the given workspace in the given backend.
Steps to Reproduce
- Create two stacks, each with its own state key (a minimal backend sketch follows after these steps).
const app = new App();
new cargostream(app, "app-dev", devConfig);
new apiStack(app, "app-api-dev", devApiConfig);
app.synth();
- Run cdktf plan "app-dev": gives the "Unable to find remote state" error.
- Run cdktf plan "app-api-dev": no error.
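For reference, a minimal sketch of what "each with its own state key" could look like, assuming an S3Backend; the stack class, bucket name, and key values below are placeholders and not taken from the gist:

import { Construct } from "constructs";
import { App, TerraformStack, S3Backend } from "cdktf";

class ExampleStack extends TerraformStack {
  constructor(scope: Construct, id: string, stateKey: string) {
    super(scope, id);
    // Each stack configures its own backend key, so neither stack should
    // depend on the other's remote state.
    new S3Backend(this, {
      bucket: "example-state-bucket", // placeholder bucket name
      key: stateKey,                  // e.g. "app-dev.tfstate" vs "app-api-dev.tfstate"
      region: "eu-north-1",
    });
  }
}

const app = new App();
new ExampleStack(app, "app-dev", "app-dev.tfstate");
new ExampleStack(app, "app-api-dev", "app-api-dev.tfstate");
app.synth();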
Versions
language: typescript
cdktf-cli: 0.20.3
node: v20.11.0
cdktf: 0.20.2
constructs: 10.3.0
jsii: null
terraform: 1.7.2
arch: arm64
os: darwin 22.5.0
providers:
  external@~>2.3.2 (LOCAL)
    terraform provider version: 2.3.3
  @cdktf/provider-aws (PREBUILT)
    terraform provider version: 5.33.0
    prebuilt provider version: 19.2.0
    cdktf version: ^0.20.0
Providers
┌───────────────┬──────────────────┬─────────┬────────────┬─────────────────────┬─────────────────┐
│ Provider Name │ Provider Version │ CDKTF   │ Constraint │ Package Name        │ Package Version │
├───────────────┼──────────────────┼─────────┼────────────┼─────────────────────┼─────────────────┤
│ external      │ 2.3.3            │         │ ~>2.3.2    │                     │                 │
├───────────────┼──────────────────┼─────────┼────────────┼─────────────────────┼─────────────────┤
│ aws           │ 5.33.0           │ ^0.20.0 │            │ @cdktf/provider-aws │ 19.2.0          │
└───────────────┴──────────────────┴─────────┴────────────┴─────────────────────┴─────────────────┘
Gist
https://gist.github.com/Dill-Dall/f8214e40fc6bba1fe71763d804d6a2e3
Possible Solutions
No response
Workarounds
Removed the ExternalProvider instances (their outputs were being cross-stack referenced, it seems; see comment).
Anything Else?
First timer on the repo.
References
https://discuss.hashicorp.com/t/cdk-multistacking-workspace-cross-reff-error/63108
Help Wanted
- [ ] I'm interested in contributing a fix myself
Community Note
- Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
- Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
- If you are interested in working on this issue or have submitted a pull request, please leave a comment
Update: I found the culprit: it was my use of the ExternalProvider in the stacks, but I don't understand why. The outputs of the external provider created the issue.
export function setupProvidersAndStateBackend(scope: Construct, envConfig: EnvConfig, tags: Tags): [AwsProvider, ExternalProvider] {
  validateEnvConfig(envConfig);

  const externalProvider = new ExternalProvider(scope, "external", {});

  // External data sources that read the local terraform and cdktf versions.
  const terraformVersion = new DataExternal(scope, "terraform_version", {
    program: ["bash", "-c", "echo '{\"version\": \"\'$(terraform version | grep 'Terraform v' | cut -d ' ' -f 2)\'\"}'"],
  });
  const cdktfVersion = new DataExternal(scope, "cdktf_version", {
    program: ["bash", "-c", "echo '{\"version\": \"\'$(cdktf --version)\'\"}'"],
  });

  // Writes tokens from this stack's data sources into the shared tags object.
  tags.terraform_version = terraformVersion.result.lookup("version") as string;
  tags.cdktf_version = cdktfVersion.result.lookup("version") as string;
  ....
In general, this should / can only happen if a token created in one stack is used in another one. The example is too small / not complete enough to understand where the issue is coming from. There must be an implicit dependency between the two, e.g. the result from this function is used in both stacks or something similar. It would be helpful if you could share a complete example (ideally a minimal one).
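To illustrate the kind of implicit dependency described above, here is a hypothetical minimal sketch (not the reporter's actual code; the import paths for the locally generated external provider are an assumption): a DataExternal token created inside one stack is written to a shared object and then read by a second stack, which makes cdktf wire up a terraform_remote_state cross-stack reference.

import { Construct } from "constructs";
import { App, TerraformStack } from "cdktf";
// Assumed import paths: a locally generated external provider binding and the
// prebuilt AWS provider, matching the versions listed above.
import { ExternalProvider } from "./.gen/providers/external/provider";
import { DataExternal } from "./.gen/providers/external/data-external";
import { AwsProvider } from "@cdktf/provider-aws/lib/provider";

// A module-level object shared by both stacks (the suspected culprit).
const sharedTags: { [key: string]: string } = {};

class StackA extends TerraformStack {
  constructor(scope: Construct, id: string) {
    super(scope, id);
    new ExternalProvider(this, "external", {});
    const terraformVersion = new DataExternal(this, "terraform_version", {
      program: ["bash", "-c", "echo '{\"version\": \"1.7.2\"}'"],
    });
    // This token belongs to StackA ...
    sharedTags.terraform_version = terraformVersion.result.lookup("version") as string;
  }
}

class StackB extends TerraformStack {
  constructor(scope: Construct, id: string) {
    super(scope, id);
    // ... but is read here, so StackB now implicitly depends on StackA and
    // synth generates a terraform_remote_state data source pointing at it.
    new AwsProvider(this, "AWS", {
      region: "eu-north-1",
      defaultTags: [{ tags: { ...sharedTags } }],
    });
  }
}

const app = new App();
new StackA(app, "app-api-dev");
new StackB(app, "app-dev");
app.synth();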
Thanks for the feedback. I found the issue; it was all on my end. The cause was that I mutated the "tags" constant, which created a cross-stack reference whenever multiple stacks used the same "tags" const. Fixed it by creating a copy instead.
export function setupProvidersAndStateBackend(scope: Construct, envConfig: EnvConfig, tags: Tags, alignWithLegacy: boolean = false): ProviderStateConfig {
  validateEnvConfig(envConfig);

  const configGeneratedTags: { [key: string]: any } = {};
  const externalProvider = new ExternalProvider(scope, "external", {});

  if (alignWithLegacy == false) {
    const terraformVersion = new DataExternal(scope, "terraform_version", {
      program: ["bash", "-c", "echo '{\"version\": \"\'$(terraform version | grep 'Terraform v' | cut -d ' ' -f 2)\'\"}'"],
    });
    const cdktfVersion = new DataExternal(scope, "cdktf_version", {
      program: ["bash", "-c", "echo '{\"version\": \"\'$(cdktf --version)\'\"}'"],
    });
    configGeneratedTags['terraform_version'] = terraformVersion.result.lookup("version");
    configGeneratedTags['cdktf_version'] = cdktfVersion.result.lookup("version");
  }

  const awsProvider = new AwsProvider(scope, "AWS", {
    region: envConfig.region || "eu-north-1",
    defaultTags: [
      {
        tags: {
          ...tags,
          ...configGeneratedTags,
          realm: envConfig.env,
        },
      },
    ],
  ....
My own bad code, not a bug 👍
I'm going to lock this issue because it has been closed for 30 days. This helps our maintainers find and focus on the active issues. If you've found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.