Many files in server assets make the app fail to build with the error: JavaScript heap out of memory
Environment
Nuxt project info: 4:49:34 pm
- Operating System: Windows_NT
- Node Version: v20.8.0
- Nuxt Version: 3.7.4
- CLI Version: 3.9.0
- Nitro Version: 2.6.3
- Package Manager: [email protected]
- Builder: -
- User Config: css, devtools, experimental, nitro, modules, postcss, pwa, routeRules, runtimeConfig
- Runtime Modules: @vueuse/[email protected], @bg-dev/[email protected], @pinia/[email protected], @pinia-plugin-persistedstate/[email protected], @vite-pwa/[email protected]
- Build Modules: -
Reproduction
- I have a Nuxt 3 app.
- I added 3500 JSON files of 300 KB each in the /server/assets folder
Describe the bug
No issues in dev (npm run dev): the app starts up and runs properly.
But on npm run build, the terminal shows FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory. This error is shown after ℹ Building Nitro Server (preset: node-server).
If I remove these 3500 JSON files, the error disappears. I also tried keeping just 1 file, and the error goes away. So the error occurs only when there are many files.
Additional context
The error occurs with different presets, e.g. firebase as well as the default node-server.
I think it would be better to add some wait time after this line, so that it does not cause the memory error:
https://github.com/unjs/nitro/blob/be6bb7ea136c464ab28eb8662eb630e11c7b0b68/src/rollup/plugins/server-assets.ts#L50
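For illustration only (this is not Nitro's actual code), one way to throttle at that point would be to read the assets in fixed-size batches; BATCH_SIZE and the function name below are made up for this sketch:

import { promises as fsp } from "node:fs";
import { resolve } from "node:path";

// Illustration only: read the server assets in fixed-size batches instead of
// firing one readFile per asset all at once, so only BATCH_SIZE buffers and
// file handles exist at any given time. BATCH_SIZE and the function name are
// made up for this sketch.
const BATCH_SIZE = 64;

async function readAssetsInBatches(dir, files) {
  const items = [];
  for (let i = 0; i < files.length; i += BATCH_SIZE) {
    const batch = files.slice(i, i + BATCH_SIZE);
    const batchItems = await Promise.all(
      batch.map(async (file) => {
        const path = resolve(dir, file);
        const src = await fsp.readFile(path);
        return { file, path, size: src.byteLength };
      })
    );
    items.push(...batchItems);
  }
  return items.sort((a, b) => a.path.localeCompare(b.path));
}

The point of the sketch is only that bounding the number of reads in flight keeps peak memory and open handles bounded as well.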
Logs
✔ Server built in 15417ms 4:28:26 pm
✔ Generated public .output/public nitro 4:28:26 pm
PWA v0.16.5
mode generateSW
precache 86 entries (1942.23 KiB)
files generated
.output\public\sw.js
.output\public\workbox-fa446783.js
ℹ Building Nitro Server (preset: node-server) nitro 4:28:32 pm
<--- Last few GCs --->
[14268:000001B7FCD02090] 191006 ms: Mark-Compact (reduce) 2045.3 (2087.0) -> 2044.6 (2087.0) MB, 481.63 / 0.07 ms (+ 406.2 ms in 97 steps since start of marking, biggest step 20.5 ms, walltime since start of marking 1103 ms) (average mu = 0.651, curren
[14268:000001B7FCD02090] 192583 ms: Mark-Compact (reduce) 2046.2 (2087.3) -> 2045.1 (2087.6) MB, 1556.42 / 0.01 ms (average mu = 0.395, current mu = 0.013) allocation failure; scavenge might not succeed
<--- JS stacktrace --->
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
1: 00007FF6A2303CEF node::SetCppgcReference+15695
2: 00007FF6A227E606 EVP_MD_meth_get_input_blocksize+78566
3: 00007FF6A22803F1 EVP_MD_meth_get_input_blocksize+86225
4: 00007FF6A2CEA191 v8::Isolate::ReportExternalAllocationLimitReached+65
5: 00007FF6A2CD3928 v8::Function::Experimental_IsNopFunction+1336
6: 00007FF6A2B35190 v8::Platform::SystemClockTimeMillis+659552
7: 00007FF6A2B32218 v8::Platform::SystemClockTimeMillis+647400
8: 00007FF6A2B4752A v8::Platform::SystemClockTimeMillis+734202
9: 00007FF6A2B47DA7 v8::Platform::SystemClockTimeMillis+736375
10: 00007FF6A2B566CF v8::Platform::SystemClockTimeMillis+796063
11: 00007FF6A2816C95 v8::CodeEvent::GetFunctionName+116773
12: 00007FF642D5AAF
Any chance you can provide a runnable reproduction? 🙏🏼
Sure, it will take some time to upload all these files.
Edit: Here is the repo to test --> https://github.com/ManasMadrecha/nuxt-nitro-server-assets-many
Let me try to run npm run build on StackBlitz. With 200 files, it works fine.
With 1000 files, it becomes slow (over 2 minutes), but the raw chunks are still generated.
Side suggestion: like other files, it would be nice if the raw chunks were listed in the terminal as they are generated, file by file, rather than only after all of them have been generated. (That way we would know how many files have been generated at a given point, instead of staring at a blank terminal for several minutes.)
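For illustration only (not Nitro's actual logging code), per-file progress could look roughly like this; processAsset is a hypothetical stand-in for whatever work is done per chunk:

// Illustration only: report progress per generated chunk instead of staying
// silent until all of them are done. `processAsset` is a hypothetical stand-in
// for whatever work is performed per file.
async function processAllAssets(files, processAsset) {
  let done = 0;
  for (const file of files) {
    await processAsset(file);
    done++;
    console.log(`server assets: ${done}/${files.length} raw chunks generated`);
  }
}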
I will now add 1000 more files and try to build again.
With ~2300 files:
The node-server build has started.
It started at 11:15, and it failed in 6 minutes 😮
My internet connection is active BTW.
I am trying again. The build started at 11:24:50.
Again, within the next 4 minutes, StackBlitz threw a connection error, but my internet works fine with other websites, so it is not a connection issue on my end.
Now, let me try on my local PC itself.
Nope, the error is still there with a brand new Nuxt 3 app: https://github.com/ManasMadrecha/nuxt-nitro-server-assets-many
So far I only have the 3500 JSON files (but each file is fairly big, ~300 KB).
Now, let me create a new branch (small-files), remove these 3500 big files, and add 12000 smaller JSON files (each ~8 KB), about 90 MB in total.
This way I can check whether the issue is with the number of files or with their size.
With the 12000 small files, I do not get the JavaScript memory error, but a different one: EMFILE: too many open files (presumably because so many files are opened at the same time).
Note: both branches (main and small-files) start properly on npm run dev. The errors pop up only on npm run build.
Thank you so much for spending the time to make a detailed explanation and reproduction ❤️ I have added this issue to my todo list to investigate ASAP. It might be a memory leak or simply a platform limitation for Windows concurrency. Will check.
@pi0 Hi, I tweaked the nitropack code in node_modules a bit, and now it's working... (I tried with the 12000 small files).
I tweaked the code in node_modules/nitropack/shared/nitro.<randomid>.mjs (the big file with ~4800 lines of code)
So, instead of this on Line 60:
const items = (await Promise.all(
  files.map(async (file) => {
    const path = resolve(dir, file);
    const src = await promises.readFile(path);
    const size = src.byteLength;
    const gzip = options.compressedSizes ? await gzipSize(src) : 0;
    return { file, path, size, gzip };
  })
)).sort((a, b) => a.path.localeCompare(b.path));
I changed it to this:
// Manas: read the files sequentially instead of all at once,
// so only one file buffer and handle is in flight at a time.
const items = [];
for (const file of files) {
  const path = resolve(dir, file);
  const src = await promises.readFile(path);
  const size = src.byteLength;
  const gzip = options.compressedSizes ? await gzipSize(src) : 0;
  const fileMapped = { file, path, size, gzip };
  items.push(fileMapped);
}
items.sort((a, b) => a.path.localeCompare(b.path));
Now, the app builds properly (even with the 12000 small files).
Nice finding! I think we can use a limited queue for parallel fs operations to avoid this.
I opened a PR using a promise pool to limit open files, similar to what the OP did:
https://github.com/unjs/nitro/pull/2145
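For reference, a minimal sketch of the promise-pool idea (not the actual PR code; POOL_SIZE and the function name are made up): a fixed number of workers pull indices from a shared counter, so at most POOL_SIZE reads are open at any moment.

import { promises as fsp } from "node:fs";
import { resolve } from "node:path";

// Sketch of a promise pool: POOL_SIZE workers share a single index into `files`,
// so at most POOL_SIZE readFile calls (and open file handles) are in flight at
// any moment. POOL_SIZE and the function name are made up for this sketch.
const POOL_SIZE = 32;

async function readAssetsWithPool(dir, files) {
  const items = new Array(files.length);
  let next = 0;
  const worker = async () => {
    while (next < files.length) {
      const i = next++;
      const path = resolve(dir, files[i]);
      const src = await fsp.readFile(path);
      items[i] = { file: files[i], path, size: src.byteLength };
    }
  };
  await Promise.all(Array.from({ length: POOL_SIZE }, worker));
  return items.sort((a, b) => a.path.localeCompare(b.path));
}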
@pi0