System.Collections.Generic.KeyNotFoundException error deploying template spec module in third-level nested module on MSFT-hosted DevOps Agents
Bicep version: Bicep CLI version 0.5.6 (5f2f88f0f0)
Describe the bug
I have a top-level bicep template, `main.bicep`, which is used within `az deployment group create` to deploy the resources.
The specific resource I'm hitting this issue with contains 2 arrays that I want to iterate through with nested iteration - e.g. I supply multiple resource groups, and for each one I loop through multiple provided role ids to apply all of the roles to all of the resource groups. This is done within a loop of the resources, so it's 3 levels of iteration.
To do this nested looping, the main template calls another bicep module, `nestedmod1.bicep`, which in turn calls another bicep module, `nestedmod2.bicep`.
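A minimal sketch of the three-level layout described above (all parameter names, the resource-group/role parameters, and the template spec path are assumptions, not the actual reproducer code):

```bicep
// ---- main.bicep: loop over the target resource groups ----
param resourceGroupNames array
param roleDefinitionIds array

module level1 'nestedmod1.bicep' = [for (rgName, i) in resourceGroupNames: {
  name: 'level1-${i}'
  scope: resourceGroup(rgName)
  params: {
    roleDefinitionIds: roleDefinitionIds
  }
}]

// ---- nestedmod1.bicep: loop over the role definition ids ----
param roleDefinitionIds array

module level2 'nestedmod2.bicep' = [for (roleId, i) in roleDefinitionIds: {
  name: 'level2-${i}'
  params: {
    roleDefinitionId: roleId
  }
}]

// ---- nestedmod2.bicep: reference the template spec ----
// (this third-level module reference is where the error surfaces)
param roleDefinitionId string

module fromSpec 'ts:00000000-0000-0000-0000-000000000000/specs-rg/role-assignment:v1' = {
  name: 'fromSpec'
  params: {
    roleDefinitionId: roleDefinitionId
  }
}
```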
This third-level nested module uses a template spec via a module declaration, but in this setup, when it runs on a Microsoft-hosted DevOps Agent it fails with the following error:
ERROR: Unhandled exception. System.Collections.Generic.KeyNotFoundException: The given key 'Bicep.Core.Syntax.ModuleDeclarationSyntax' was not present in the dictionary.
at System.Collections.Immutable.ImmutableDictionary`2.get_Item(TKey )
at Bicep.Core.Workspaces.SourceFileGrouping.LookUpModuleSourceFile(ModuleDeclarationSyntax moduleDeclaration)
at Bicep.Core.Semantics.ModuleSymbol.TryGetSemanticModel(ISemanticModel& semanticModel, ErrorDiagnostic& failureDiagnostic)
at Bicep.Core.TypeSystem.DeclaredTypeManager.GetDeclaredModuleType(ModuleDeclarationSyntax module)
at Bicep.Core.TypeSystem.DeclaredTypeManager.GetModuleType(ModuleDeclarationSyntax syntax)
...
I can provide the full error if it would help.
Strangely, calling the same `az deployment group create` (or what-if) command with the same setup works from my laptop, but it always seems to cause the above error on MSFT-hosted Windows agents via the template deployment - from what I can see, with the same versions of az (2.34.1) and the bicep CLI (0.5.6).
If I change the final module (`nestedmod2.bicep`) to deploy the resource directly, rather than using the template spec, it works. Calling the template spec via its full reference (without the alias) still fails, as does a very simplified call to a template spec within that final module - one without loops etc. Moving that simple template spec code up into the second-level bicep module doesn't error, but for my scenario I think I need that extra level for the nested looping I'm trying to implement.
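The workaround described above can be sketched as follows - in the final module, declare the resource directly instead of going through the template spec module reference (the resource type, API version, and parameter names here are assumptions for illustration, since the actual resources are redacted):

```bicep
// nestedmod2.bicep (workaround variant): deploy the role assignment
// directly rather than via a 'ts:' module reference.
param roleDefinitionId string
param principalId string

resource roleAssignment 'Microsoft.Authorization/roleAssignments@2020-10-01-preview' = {
  name: guid(resourceGroup().id, principalId, roleDefinitionId)
  properties: {
    roleDefinitionId: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', roleDefinitionId)
    principalId: principalId
  }
}
```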
To Reproduce: I've attached a small redacted reproducer, including copies of the template specs that are targeted in the deployment. Please let me know if it isn't clear what I'm trying to do.
Additional context: To be honest, I'm not sure whether this is a bug in the bicep CLI at all, so apologies if this isn't the relevant project to post against - it was my initial thought to do so. So this is as much a question as a bug report: I may be doing something wrong, or what I'm trying to do may not be supported.
Hi @roggypro
I've been able to reproduce the error in Azure DevOps with the code you provided, thank you for that.
It is indeed a bug in Bicep. I don't know exactly why it happens yet; the debug info is limited and we'll need to dive deeper into this.
The good news is that you've identified a workaround.
Thanks @slapointe, appreciate you taking the time to look into this, reproduce it and confirm it is a bug.
Please let me know if you need any more info from me - as you say, we do have the workaround to deploy the resource directly within the bottom-level bicep template, and that works fine - we are keen to use template specs where possible in order to standardise, but it isn't holding us up with the workaround.
Many thanks, Rog
Also calling the template spec via its full reference (without the alias), it still fails
This is interesting - so only on this DevOps agent and only when using registry aliases.
Apologies if I'm misunderstanding, but my point was that it fails with the same error whether using a Template Spec alias or the full Template Spec path - i.e. it doesn't seem to be related to the alias itself, from my testing at least.
Hope this helps. Thanks
Also been stung by this, would be good to find a workaround or have an update on how diagnosing is going please :)
We are also having this problem with the Azure Container Registry.
I have a similar issue. I found that Windows `pwsh` works fine, while WSL Ubuntu `pwsh` fails, similar to my `ubuntu-latest` hosted pipeline agent. I also tried the `windows-latest` hosted agent with the same failure, so I'm not sure what the actual problem is.
Update: I was able to repro this locally by removing the `~/.bicep/ts` local cache and then running the `what-if` command. If I rerun the command after the failure, it works. I can repeat the failure by removing the local cache.
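The local repro described above can be sketched roughly like this (the resource group and template file names are assumptions, not from the original comment):

```shell
# Clear the local template spec cache so the next build must restore it.
CACHE_DIR="${HOME}/.bicep/ts"
rm -rf "${CACHE_DIR}"
echo "cleared ${CACHE_DIR}"

# First run after clearing: fails with the KeyNotFoundException.
# az deployment group what-if --resource-group my-rg --template-file main.bicep

# Second run (cache now populated by the restore): succeeds.
# az deployment group what-if --resource-group my-rg --template-file main.bicep
```

The `az` commands are commented out because they need a live Azure subscription; the key observation is that the failure only occurs on the run that performs the restore.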
i am also hitting this issue when building on Ubuntu (Azure DevOps) but not on Windows (Azure DevOps) or local (pwsh or wsl).
@alex-frankel , here is an internal link to where I had to remove use of the bicep registry. Let me know if I can do anything to help debug. (We could create a new branch from the pre-fix to repro) https://msazure.visualstudio.com/One/_git/Azure-Gaming/commit/2062694455a6b7cb4ba17779e04c4187bca7a5a3?refName=refs%2Fheads%2Fdciborow%2Fdeploy-arm
@shenglol , FYI
I can confirm that when you try a second attempt in your build pipeline, directly after the failed attempt, the deployment does work! I'm using `vmImage: ubuntu-latest`.
I found the root cause of the bug. What happens is that if a module contains a `br:`/`ts:` reference, then after the reference is restored we only rebuild the parent module containing that reference; we don't refresh all of its ancestors, including the root module.
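The mechanism described above can be modeled with a toy sketch (this is not Bicep's actual code, just an analogy): each compiled file keeps a lookup keyed by its module-declaration syntax nodes, and rebuilding a file allocates fresh nodes. If an ancestor's lookup is not refreshed after the restore-triggered rebuild, it is probed with a new node while still holding the old one, which is the Python analogue of the `KeyNotFoundException`:

```python
class ModuleDecl:
    """Stands in for Bicep.Core.Syntax.ModuleDeclarationSyntax.

    No __eq__/__hash__, so dictionary lookups are by object identity,
    mirroring reference-equality keys in the real dictionary.
    """
    def __init__(self, path):
        self.path = path

def compile_file(path):
    """Rebuilding a file creates fresh syntax nodes for its module declarations."""
    decl = ModuleDecl(path)
    return decl, {decl: path}  # per-file "module decl -> source file" lookup

# Initial build: the root's lookup is keyed by the node from this build.
root_decl, root_lookup = compile_file('nestedmod2.bicep')

# The ts: restore completes and only the file owning the reference is
# rebuilt; the root's lookup is NOT refreshed, so type checking walks the
# NEW node while the lookup still holds the OLD one.
new_decl, _ = compile_file('nestedmod2.bicep')

try:
    root_lookup[new_decl]  # stale lookup, fresh key
except KeyError:
    print('stale ancestor lookup after restore')  # the failure mode
```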