terraform-cdk
go and azurerm out of memory with cdktf get/synth
Community Note
- Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
- Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
- If you are interested in working on this issue or have submitted a pull request, please leave a comment
cdktf & Language Versions
- 16GB Windows laptop, 8GB WSL2 Ubuntu instance + 2GB swap
- go 1.8.3
- node 18.4.0
- Terraform v1.2.3
- cdktf 0.11.2
- `"terraformProviders": ["hashicorp/azurerm@~> 3.0.0"]`
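For context, the provider constraint quoted above lives in the project's `cdktf.json`. A minimal sketch of such a file for a Go project (fields other than the quoted `terraformProviders` entry are illustrative defaults, not taken from this report):

```json
{
  "language": "go",
  "app": "go run main.go",
  "terraformProviders": ["hashicorp/azurerm@~> 3.0.0"]
}
```

`cdktf get` reads this file and generates Go bindings for each listed provider, which is the step that runs out of memory below.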
Affected Resource(s)
azurerm provider
Debug Output
Expected Behavior
Fast and successful compile for cdktf get and cdktf synth
Actual Behavior
Behaviour is similar to the links below. Out of memory: WSL's 10GB (8GB RAM + 2GB swap) is exhausted, and execution takes minutes at 100% CPU.
The system requirements are also unclear from https://www.terraform.io/cdktf
I found a bug report stating "azurerm provider needs ~13 GBs of memory to get generated", but it was closed.
I appreciate that Go support is experimental, but looking through the bugs it looks like a JSII/Node problem?
Worth mentioning that gopls is also crashing frequently when editing the project in vscode.
Currently unusable for me, unfortunately.
Steps to Reproduce
clean install - calls to azurerm provider for servicebus, storage accounts, eventgrid
Important Factoids
n/a
References
- fix(lib): Increased --max-old-space-size value #1265
- Generating azurerm provider for go exceeds 'hard' memory limit #1264
- https://github.com/aws/jsii/issues/3091
- Generating template for GCP provider with TypeScript consistently leads to V8 module hitting OOM error #1885
The problem we are facing is that we generate quite a bit of TS code for JSII to translate into all other languages, and JSII has some performance problems. This problem periodically gets worse as we add more features to CDKTF, since new features mean new code and therefore more strain on JSII. I totally agree with you that this behaviour is unacceptable; just wanted to share some context :)
@adeturner I inherited a similar setup utilizing go+azurerm. Currently stuck at version 0.5.0 of cdktf because newer versions take way too much memory and CPU to compile. Even this old version takes around 6-7GB to compile, with each newer version making the problem exponentially worse (my laptop froze for a couple of hours after I upgraded to 0.11). CDKTF with Go is borderline unusable at this point :(
Hi 👋 I just wanted to quickly plug that we have pre-built providers for Go since CDKTF v0.12. So, if you don't have a specific requirement for a certain azurerm provider version, switching to pre-built providers might do the trick until we get the underlying memory consumption issues fixed.
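Switching to a pre-built provider means dropping the `terraformProviders` entry from `cdktf.json` (so `cdktf get` has nothing to generate) and depending on the published Go module instead. A sketch of the dependency; the module path and version below are assumptions, so check the pre-built providers documentation for the exact coordinates matching your CDKTF release:

```
// go.mod (fragment) — illustrative only; the cdktf-provider-azurerm-go
// module path and the version pin are assumptions, not taken from this thread.
require github.com/cdktf/cdktf-provider-azurerm-go/azurerm v2.0.0
```

Because the bindings ship pre-compiled, this sidesteps the JSII code-generation step that exhausts memory during `cdktf get`.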
To clarify, @htonkovac: Do you need 6-7 GB of memory to run `cdktf get` or to run `cdktf synth`?
Hello,
I'm using cdktf 0.5 with go to deploy a single Azure resource.
Running `/bin/time cdktf synth` takes around 0.6GB. However, this uses a cache. Consider what happens in a CI/CD system: the code is compiled for the first time and there is no cache.
So, to simulate a run with no cache, I run `/bin/time go build -a`. And this takes 5.7GB of memory!! In newer versions of cdktf this problem only got worse. (I'm stuck at v0.5.)
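A minimal sketch of the cold-cache measurement described above, assuming GNU time is installed at `/usr/bin/time` (its `-v` flag reports peak memory; `go clean -cache` is an alternative to `-a` for forcing a full rebuild):

```shell
# Drop the build cache so the next build recompiles everything,
# similar in spirit to `go build -a`.
go clean -cache

# GNU time's -v output includes "Maximum resident set size (kbytes)",
# which is the peak-memory figure being discussed in this thread.
/usr/bin/time -v go build ./... 2>&1 | grep "Maximum resident set size"
```

Note that the shell's built-in `time` keyword only reports CPU/wall time, which is why the full path to the GNU binary is used here.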
The azurerm.go file in my generated folder has 800,000 lines of code; newer versions of cdktf had double that!
-- Hrvoje Tonkovac
We previously gave cdktf (in Go) a try and were scared away by its poor performance: https://dev.to/michael_lin/deploy-infrastructure-using-cdk-for-terraform-with-go-28ne
recently, we saw cdktf went GA and decided to give it another run
version: 0.12.3
system: M1 Max
- `cdktf get` is significantly faster 👍
- `go build` time is acceptable even with a lot of providers 👍
- `go build` time is not acceptable after adding some external modules written in HCL (e.g. https://github.com/terraform-google-modules/terraform-google-sql-db). It kept on running for a few minutes and I gave up eventually. 👎

conclusion: cdktf (in Go) is still pretty unusable
Hi 👋 We recently released CDKTF 0.13, which contains some major performance improvements for Go. Head over to https://cdk.tf/0.13 to learn more about this recent change!
While this should improve provider-related performance issues (as initially raised in this issue), if there are special cases (e.g. with special module configurations or similar), don't hesitate to file a new issue and we'll try to reproduce and improve those cases as well 🕵️
Hey, we are seeing great improvement in Go 👍
- fresh build without any build cache: 8 min -> 6 sec
- subsequent build with cache (no change to auto-generated code): 8 sec -> 2 sec
thank you ;)
I'm going to lock this issue because it has been closed for 30 days. This helps our maintainers find and focus on the active issues. If you've found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.