Performance regression when compiling types in version 5.4.2
🔎 Search Terms
"regression", "5.4.2", "performance"
🕗 Version & Regression Information
- This changed between versions 5.3.3 and 5.4.2
⏯ Playground Link
No response
💻 Code
Extended diagnostics in version 5.4.2:

```
Files: 4294
Lines of Library: 40640
Lines of Definitions: 180807
Lines of TypeScript: 430513
Lines of JavaScript: 0
Lines of JSON: 0
Lines of Other: 0
Identifiers: 591567
Symbols: 1232510
Types: 4677381
Instantiations: 395496734
Memory used: 2330550K
Assignability cache size: 3974893
Identity cache size: 14482
Subtype cache size: 37794
Strict subtype cache size: 9374
I/O Read time: 0.70s
Parse time: 0.96s
ResolveModule time: 0.58s
ResolveTypeReference time: 0.01s
ResolveLibrary time: 0.01s
Program time: 2.51s
Bind time: 0.62s
Check time: 367.45s
printTime time: 0.00s
Emit time: 0.00s
Total time: 370.58s
```
Extended diagnostics in version 5.3.3:

```
Files: 4290
Lines of Library: 40241
Lines of Definitions: 180807
Lines of TypeScript: 430513
Lines of JavaScript: 0
Lines of JSON: 0
Lines of Other: 0
Identifiers: 591320
Symbols: 1234773
Types: 699200
Instantiations: 2247099
Memory used: 1308962K
Assignability cache size: 223823
Identity cache size: 14479
Subtype cache size: 28353
Strict subtype cache size: 9128
I/O Read time: 0.73s
Parse time: 1.11s
ResolveModule time: 0.44s
ResolveTypeReference time: 0.01s
ResolveLibrary time: 0.02s
Program time: 2.60s
Bind time: 0.76s
Check time: 13.75s
printTime time: 0.00s
Emit time: 0.00s
Total time: 17.11s
```
🙁 Actual behavior
Type checking with tsc --noEmit takes more than 20 times longer on version 5.4.2 than on version 5.3.3.
In CI, it always hits the heap memory limit.
🙂 Expected behavior
I expected check times to be roughly equivalent across both versions, but 5.4.2 was more than 20 times slower.
Additional information about the issue
No response
Could you share a repro case of this problem?
The number of instantiations went up over a hundredfold 😱
If you can reproduce this consistently, consider bisecting using https://www.npmjs.com/package/every-ts.
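Before a full commit-level bisect with every-ts, a coarse first pass is to compare the nightly builds on either side of the suspected window. A minimal sketch; the exact dev version strings below are assumptions based on TypeScript's nightly naming scheme, picked only as an example:

```sh
# Compare check time and memory across two nightlies straddling the suspected
# window; every-ts can then narrow things down to an individual commit.
npx -p typescript@5.4.0-dev.20231129 tsc --noEmit --extendedDiagnostics
npx -p typescript@5.4.0-dev.20231130 tsc --noEmit --extendedDiagnostics
```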
> If you can reproduce this consistently, consider bisecting using https://www.npmjs.com/package/every-ts.

I was able to pinpoint the exact commit where the performance problem started.
```
8d1fa440dd5ad547e836abcca45ccc94e81f1fe2 is the first bad commit
commit 8d1fa440dd5ad547e836abcca45ccc94e81f1fe2
Author: Mateusz Burzyński <[email protected]>
Date:   Thu Nov 30 20:24:04 2023 +0100

    Defer processing of nested generic calls that return constructor types (#54813)

    Co-authored-by: Nathan Shively-Sanders <[email protected]>
```
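For readers unfamiliar with the pattern named in that commit, here is a hypothetical, minimal sketch of a nested generic call whose inner result is a constructor type. The declarations are made up for illustration, and this tiny file is not expected to reproduce the slowdown on its own:

```sh
# Hypothetical sketch only: a generic call returning a constructor type,
# nested inside another generic call (the shape of code #54813 defers).
cat > repro.ts <<'EOF'
declare function mixin<T>(base: T): new () => { value: T };
declare function register<C extends new () => unknown>(ctor: C): InstanceType<C>;
// The inner call's constructor-type result feeds inference for the outer call.
const instance = register(mixin({ flag: true }));
EOF

# Compare instantiation counts and check time across the two releases.
npx -p typescript@5.3.3 tsc --noEmit --extendedDiagnostics repro.ts
npx -p typescript@5.4.2 tsc --noEmit --extendedDiagnostics repro.ts
```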
> Could you share a repro case of this problem?

Unfortunately, I cannot share a repro for this problem, as it is an internal closed-source project at my company.
@Andarist It's one of your PRs: https://github.com/microsoft/TypeScript/pull/54813
> Unfortunately, I cannot share a repro for this problem, as it is an internal closed-source project at my company.

Understandably, you might not be able to share it as-is. To investigate this I'd need some kind of repro case though - I can't chase a wild goose.
> Understandably, you might not be able to share it as-is. To investigate this I'd need some kind of repro case though - I can't chase a wild goose.

Sure thing, I will work on creating a reproduction for this issue. It may take a while as our codebase is quite large.
I sent a PR to revert the change, but it's likely that the underlying bug is unrelated and was merely exposed by this change. @weswigham believes it is possible to construct a failure that relies exclusively on functions, not on constructors. Long-term, we should isolate and fix the underlying bug and re-apply this change.
@georgekaran would one of the TS team at Microsoft be able to sign an NDA to get access to your code? Then we could isolate the problem and produce a repro that's not linked to your code.
Even if you can't share the code for now, can you try running tsc via https://www.npmjs.com/package/pprof-it? The profile won't include any code, and if you enable PPROF_SANITIZE=true, it shouldn't include any paths or other info from your machine. (But I would verify the output file.)
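A run along these lines should produce a shareable profile. The invocation below is my understanding of pprof-it (it hooks in via Node's --require), so defer to the package's README for the exact environment variables and output options:

```sh
# Sketch, assuming pprof-it's --require-based usage: profile the type check
# with sanitization enabled, then review the generated profile before sharing.
npm install --save-dev pprof-it
PPROF_SANITIZE=true node --require pprof-it ./node_modules/typescript/lib/tsc.js --noEmit
```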
I just want to chime in and say that I think I'm encountering the same issue; however, updating to 5.4.3 hasn't fixed the problem. I'm still seeing the error:

```
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
```
I'm actually seeing this when running ESLint, but I've been able to track it down to something related to the TypeScript config that's being used. Using the same config that worked fine with TS 5.3 will cause the OOM errors in TS 5.4. Changing that config to exclude certain directories (and hence reducing the overall size of the project) avoids the OOM errors.
Unfortunately, I also cannot share a repro for the same reasons (closed-source, company project, etc), and there's next to no chance of an NDA being allowed. I'll see what I can do to get a simpler reproduction though.
I would definitely recommend using every-ts and linking it in to bisect, if it's reproducible. But I would doubt that your issue is related; typically bugs like this aren't, but a bisect would say.
@jakebailey Looks like you're right about what I'm seeing being a different bug. Commit 8d1fa440dd5ad547e836abcca45ccc94e81f1fe2 doesn't exhibit the behavior that I'm seeing.
every-ts is awesome by the way! ❤️
~~I had a bit of trouble using every-ts bisect to pinpoint when the bug that I'm seeing was introduced. Bisecting between 5.3.3 and 5.4.2 led me to the problem commit being one that only changed files in the .github/ folder. I have a feeling that might be because the bug was introduced at some point, then fixed or reverted, then introduced again.~~
Edit: I must have made a mistake on multiple bisect runs, because I just tried again and ended up with the correct commit that's causing my error. 🤦‍♂️
I was able to successfully use every-ts bisect to pinpoint the commit where the bug I'm seeing was introduced. I'll create a new issue about it.
Edit: #58011 is the new issue.
Sorry, I commented on a couple of other old issues before I read the comments about trying to group things at least by version (in my defense, there are a lot of OOM issues open).
--
I'm running into the same issue on the GitHub Actions runner (ubuntu-latest) but not locally (M1 Pro).
Please ask for more info that might be helpful and I'll examine if there's something I'm at liberty to share.
Some basic relevant info:
- Next App, large-ish project.
- Running into this error on [email protected] but working on [email protected] in GitHub Actions; it always works locally.
- Using "zod": "^3.20.3" (this seemed related to another OOM issue I found).
> Sorry, I commented on a couple of other old issues before I read the comments about trying to group things at least by version (in my defense, there are a lot of OOM issues open).

Not sure what you mean; we want new issues for different OOMs because they're very likely unrelated, unless you've bisected the problem to the same thing as this particular issue.
> unless you've bisected the problem to the same thing as this particular issue.

Right, gotcha, well that's kinda difficult in my case as it's only happening on GH Actions :/
What would be the best way to try and spot it?
Still testing manually on local (with every-ts) and checking for a memory spike?
And hoping that it's platform (Mac vs. Ubuntu) independent?
(thx for the quick reply btw!)
--extendedDiagnostics should show memory usage, or you could artificially lower the max memory with Node's max-old-space-size. That or provide a repro.
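Concretely, to approximate a CI memory ceiling locally and read the compiler's own numbers, something like this should do (the heap size is just an example value, in MB):

```sh
# Cap Node's old-space heap and print the per-phase time and "Memory used"
# figures from the compiler itself.
NODE_OPTIONS=--max-old-space-size=4096 npx tsc --noEmit --extendedDiagnostics
```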
Just wanted to give my two cents, as I'm also experiencing the issue and hoping this will help in resolving it.
Following the guidelines here, I generated traces for npx tsc --noEmit when running the build both locally and through Docker (which is how the project is built in the CI environment). The OOM error only occurs in CI here as well.
The local build does not run out of memory. Both the local and the Docker traces, however, reveal large recursive trace events for checkSourceFile related to types in the @octokit/types package, which includes very large union types. The events mention the traceUnionsOrIntersectionsTooLarge_DepthLimit title. This makes sense given that the process only runs out of memory when building via Docker, which has a 7.9 GB memory limit. OP might want to check whether this is the same underlying issue. Adding NODE_OPTIONS="--max-old-space-size=4096" to the tsc command fixes the issue but isn't a viable long-term solution. Increasing the memory limit in Docker Desktop to 16 GB also resolves the error.
I'm running this with [email protected]. Unfortunately, I also can't share the code due to it being closed source.
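For reference, traces like the ones described above can be generated and summarized roughly as follows; the output directory name is arbitrary, and @typescript/analyze-trace is a companion tool for reading the result:

```sh
# Generate a trace of the type check, then summarize the hot spots it contains.
npx tsc --noEmit --generateTrace ./ts-trace
npx @typescript/analyze-trace ./ts-trace
```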
I believe I've run up against this same issue while migrating a repository from 5.0.3 to 5.9.3. Working backwards from an OOM error, I've found that there's a ~10% increase in the memory required to run the tsc command when upgrading past version 5.3.3.
I've isolated the issue to release 5.4.0-dev.20231130, which I believe has already been mentioned earlier in this thread. We happen to use two separate configs for our test and application files, which led me to discover that the memory increase only appears when test files are being processed.
With that, I've managed to put together a repo where the error can be reproduced. The repo contains ~500 dummy test files, jest, and testing-library. Let me know if there's any further information I can provide; I'd be very happy if there's some way to improve things!
https://github.com/lokalise-mark/typescript-performance-regression-repro-example