Memory leak with Node.js 20+
Prerequisites
- [x] I confirm my issue is not in the opened issues
- [x] I confirm the Frequently Asked Questions didn't contain the answer to my issue
Environment check
- [x] I'm using the latest `msw` version
- [x] I'm using Node.js version 20 or higher
Node.js version
20.13.0
Reproduction repository
https://github.com/leaveswoods/msw-mem-leak
Reproduction steps
- Clone the repo: https://github.com/leaveswoods/msw-mem-leak
- Install Node.js 20
- Under the test project root, run `./runMswMinimalTest.sh`
- Check the logs for heap usage before and after the tests.
Current behavior
The heap usage increases significantly after the tests with msw enabled, even though:
- we force GC after the tests complete (see the teardown sketch below);
- we close the msw server at the end.
Expected behavior
The heap usage should not increase over time with msw enabled.
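For reference, a minimal sketch of the teardown described under Current behavior (forcing GC and closing the msw server), assuming Jest-style hooks and Node run with `--expose-gc`; this is not the exact code from the reproduction repository:

```js
// Minimal teardown sketch; assumes Jest globals and `node --expose-gc`.
// Not the exact code from the reproduction repository.
import { setupServer } from 'msw/node'

const server = setupServer(/* ...handlers... */)

beforeAll(() => server.listen({ onUnhandledRequest: 'bypass' }))

afterAll(() => {
  // Close the msw server at the end.
  server.close()
  // Force GC after the tests complete, then log the remaining heap usage.
  global.gc?.()
  const heapUsedMb = process.memoryUsage().heapUsed / 1024 / 1024
  console.log('heapUsed after tests:', heapUsedMb.toFixed(2), 'MB')
})
```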
We are also running into this with similar symptoms as #2405
> We are also running into this with similar symptoms as #2405
I also checked this thread and tried using 2.8.5 as suggested, but no luck. The latest version does not work either.
@kettanaito We’d love your input here, as this issue seems closely related to #2405.
We’re currently using MSW in a long-lived server context to mock a few external services in a load-testing environment. Unfortunately, the memory leak is causing our servers to exhaust memory over time, which skews our test results.
If you have any thoughts, guidance, or suggested workarounds, they’d be greatly appreciated. Thanks for all your work on this!
Hi, @leaveswoods. Thanks for reporting this (and for the reproduction repository). I will take a look at this once I have a moment. I encourage others to dive into this, too, following the reproduction steps and analyzing the heap map/other artifacts to see what's accumulating memory. Share your findings here so we could discuss the ways to fix the issue. Thanks.
I'm trying to put the reproduction scenario in an automated test (see #2542) but the memory usage delta I'm getting is always high, even when not using MSW. Does anybody see any obvious errors with this test? Would appreciate any feedback here. Making an automated test out of this is the first step to fixing the issue.
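One possible source of the high baseline delta (an assumption, not a verified diagnosis): `heapUsed` is sampled before any garbage collection runs, so both scenarios include garbage that simply hasn't been collected yet. A small helper sketch, assuming the test runs under `node --expose-gc`:

```js
// Sketch of a more stable heap measurement: force GC (requires
// `node --expose-gc`) and let the event loop settle before sampling.
async function measureHeapUsed() {
  global.gc?.()
  await new Promise((resolve) => setTimeout(resolve, 100))
  return process.memoryUsage().heapUsed
}

// Usage:
// const before = await measureHeapUsed()
// ...run the scenario...
// const after = await measureHeapUsed()
// console.log((after - before) / 1024 / 1024, 'MB')
```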
Hi @kettanaito, thanks for replying to us. I think your test is fine. I copied your test code and created two JS files, with and without msw:
```js
// withMsw.js
import http from 'node:http'
import { setupServer } from 'msw/node'

async function profile() {
  const startMemoryUsage = process.memoryUsage().heapUsed

  for (let i = 0; i < 1000; i++) {
    const server = setupServer()
    server.listen({ onUnhandledRequest: 'bypass' })

    await new Promise((resolve) => {
      http
        .get('http://localhost/non-existent', () => resolve())
        .on('error', () => resolve())
    })

    server.close()
  }

  const endMemoryUsage = process.memoryUsage().heapUsed
  const memoryUsed = endMemoryUsage - startMemoryUsage
  global.gc?.()
  console.log(memoryUsed / 1024 / 1024, 'MB')
}

profile()
```
```js
// withoutMsw.js
import http from 'node:http'

async function profile() {
  const startMemoryUsage = process.memoryUsage().heapUsed

  for (let i = 0; i < 1000; i++) {
    await new Promise((resolve) => {
      http
        .get('http://localhost/non-existent', () => resolve())
        .on('error', () => resolve())
    })
  }

  const endMemoryUsage = process.memoryUsage().heapUsed
  const memoryUsed = endMemoryUsage - startMemoryUsage
  global.gc?.()
  console.log(memoryUsed / 1024 / 1024)
}

profile()
```
I ran the two JS files separately multiple times. As you can see from the output below, the memory delta is much higher when msw is in use.
```
➜ msw git:(main) ✗ node withMsw.js
6.177215576171875 MB
➜ msw git:(main) ✗ node withoutMsw.js
1.280303955078125
➜ msw git:(main) ✗ node withMsw.js
9.632209777832031 MB
➜ msw git:(main) ✗ node withoutMsw.js
0.668487548828125
➜ msw git:(main) ✗ node withMsw.js
7.8651275634765625 MB
➜ msw git:(main) ✗ node withoutMsw.js
0.6793899536132812
➜ msw git:(main) ✗
```
I also found an interesting thing: different Node versions can behave differently here.
My apologies, the above test was run on Node 18. I have run the test again on Node 20; the memory leak is still there, but it is smaller than on Node 18.
My current workaround is switching to another library, `"nock": "^13.5.6"`, which does not have the memory leak issue (see the sketch below).
However, nock >= 14 also has the memory leak, and I think that is likely because nock >= 14 starts to use @mswjs/interceptors, while nock 13 does not.
Hope this helps with the debugging, @kettanaito.
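For readers looking for that interim workaround, a minimal sketch of a nock 13 setup; the host and route below are hypothetical and not taken from the reproduction repository:

```js
// Minimal nock 13 sketch of the workaround mentioned above.
// Requires: npm i nock@13 (the host and route here are hypothetical)
import nock from 'nock'

nock('http://localhost')
  .get('/non-existent')
  .reply(200, { ok: true })

// ...run the code under test...

// Clean up interceptors between tests to avoid cross-test state:
// nock.cleanAll()
```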
Thanks, @leaveswoods. The root cause is likely in Interceptors and how we handle the socket connection.
It's a bit hard to have a reproducible test if both scenarios consume some memory and the faulty one just happens to consume more... I will try to wrap my head around this, but your insight helped, thank you.
As to why different Node.js versions behave differently here: it is likely because we are tapping into the native `net` module and its sockets, and Node.js behavior there may differ across versions.
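To illustrate what "tapping into the native `net` module and its sockets" can look like, here is a drastically simplified sketch; it is not the actual @mswjs/interceptors implementation, and the agent-wrapping approach is only an analogy:

```js
// Drastically simplified illustration of intercepting at the socket level.
// NOT the actual @mswjs/interceptors code; shown only to explain why
// version-to-version changes in Node's net/socket internals surface here.
import http from 'node:http'

const agent = new http.Agent()
const originalCreateConnection = agent.createConnection.bind(agent)

agent.createConnection = (options, callback) => {
  const socket = originalCreateConnection(options, callback)
  // An interceptor would observe/forward the raw stream via listeners here;
  // any listener that is never removed keeps the socket (and its buffers) alive.
  socket.on('data', () => {})
  return socket
}

http
  .get('http://localhost/non-existent', { agent }, (res) => res.resume())
  .on('error', () => {
    // Expected locally: nothing is listening on this port.
  })
```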
We encountered a crash when upgrading msw. Through a binary search, I identified that the issue is caused by https://github.com/mswjs/msw/pull/2508 (which includes an @mswjs/interceptors update). I also confirmed a memory leak using the --logHeapUsage option.
We are using Node 22; the error message is as follows:
```
node: ../deps/uv/src/unix/stream.c:456: uv__stream_destroy: Assertion `!uv__io_active(&stream->io_watcher, POLLIN | POLLOUT)' failed.
```
Hope this helps for further investigation.
These are the heap snapshot comparisons, sorted by size delta.
@rajyan, thanks! Could you please share the steps you took to generate that snapshot? If I can get it, it will help a lot in tracking down the culprit.
Meanwhile, I did a simple unresolved promise check and found no unresolved promises. I think the way I'm creating a heap profile isn't correct, which is why I'm not seeing that memory leak.
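For anyone who wants to repeat such a check, one possible approach is counting promises that never settle via `async_hooks`; this is a sketch of the general technique, not necessarily the exact check used above:

```js
// Sketch: count promises that are created but never settle during a run.
// Only async IDs are stored, so the hook itself does not retain the promises.
import { createHook } from 'node:async_hooks'

const pendingPromises = new Set()

createHook({
  init(asyncId, type) {
    if (type === 'PROMISE') pendingPromises.add(asyncId)
  },
  promiseResolve(asyncId) {
    pendingPromises.delete(asyncId)
  },
}).enable()

// ...run the scenario under test, then report what never settled:
setTimeout(() => {
  console.log('promises still pending:', pendingPromises.size)
}, 1_000)
```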
One potential cause for this is here.
`http.request -> ClientRequest(mockAgent) -> MockHttpAgent(mockAgent)`
If a ClientRequest is created with a MockHttpAgent instance as the value for a custom agent request option, that creates a mock agent over a mock agent, which might be causing excessive memory consumption.
But upon debugging, the ClientRequest proxy isn't called when making requests via http.request().
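For context, a minimal illustration of the request shape being discussed: a consumer-supplied agent passed via the request options. The agent below is a plain `http.Agent` used purely for illustration, since `MockHttpAgent` is internal to the interceptor:

```js
// Illustration of a request made with a custom `agent` option.
// Under interception, this agent would itself be wrapped by the library,
// which is the "mock agent over a mock agent" scenario discussed above.
import http from 'node:http'

const customAgent = new http.Agent({ keepAlive: true })

http
  .get('http://localhost/non-existent', { agent: customAgent }, (res) => res.resume())
  .on('error', () => {
    // Expected locally: nothing is listening on this port.
  })
```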
I don't believe #2508 is where the issue was introduced as that PR itself attempts to address a memory leak (#2405). That hints that the problem existed before #2508 and we should look further down the release history to see what introduced it.
As always, any help is appreciated here.
@kettanaito
> I don't believe https://github.com/mswjs/msw/pull/2508 is where the issue was introduced as that PR itself attempts to address a memory leak (https://github.com/mswjs/msw/issues/2405). That hints that the problem existed before https://github.com/mswjs/msw/pull/2508 and we should look further down the release history to see what introduced it.
That makes sense. Thank you for pointing that out! I’ll look further back in the release history as well.
> Could you please share the steps you took to generate that snapshot? If I can get it, it will help a lot in tracking down the culprit.
I added code in our test suite to take a memory heap snapshot right before it crashes due to the memory leak, and then compared the snapshots across different versions. Since it involves some private code, I’m not able to share the exact steps right away, but I’ll try to put together something shareable when I have time!
Understandable! Always be cautious when sharing this kind of thing and, ideally, never share it at all; use a minimal reproduction instead.
We have these failures in CI as well when upgrading msw from 2.8.2 to 2.11.1. I then checked the release notes for when interceptors were last updated and tried that release (2.10.0), but sadly I'm still getting the crashes.
```
node: ../deps/uv/src/unix/stream.c:456: uv__stream_destroy: Assertion `!uv__io_active(&stream->io_watcher, POLLIN | POLLOUT)' failed.

Summary of all failing tests
FAIL components/ConnectedUserCard/__tests__/ConnectedUserCard.test.tsx
  ● Test suite failed to run

    A jest worker process (pid=106) was terminated by another process: signal=SIGABRT, exitCode=null. Operating system logs may contain more information on why this occurred.

      at ChildProcessWorker._onExit (../../node_modules/jest-worker/build/workers/ChildProcessWorker.js:370:23)

Test Suites: 1 failed, 7 skipped, 367 passed, 368 of 375 total
```
And I can also confirm that the crash is gone when downgrading back to 2.8.4, which also downgrades @mswjs/interceptors back to 0.37.6.
Thanks for verifying that, @Stanzilla. I suspect that the root cause is somewhere in the Socket-based interceptor, but the real question is where.
> Thanks for verifying that, @Stanzilla. I suspect that the root cause is somewhere in the Socket-based interceptor, but the real question is where.
Do you have copilot set up for this repo? We could let it investigate
@Stanzilla, I don't believe I ever configured Copilot for this repo. You can open it on github.dev or on your local machine. Any kind of help with this issue is hugely appreciated!
> @Stanzilla, I don't believe I ever configured Copilot for this repo. You can open it on github.dev or on your local machine. Any kind of help with this issue is hugely appreciated!
Alright, you can enable that in the repo settings if you ever want to. I just let it run and here is what it spat out; I have not verified anything:
Memory Leak Investigation & Analysis
I've investigated the memory leak issue reported in the Socket-based interceptor for Node.js 20+ and can confirm the problem exists. Here are my findings:
Root Cause Identified
The memory leak occurs in the MockHttpSocket class when operating in "passthrough" mode (when requests are bypassed rather than mocked). The issue is in src/interceptors/ClientRequest/MockHttpSocket.ts:
Problem: When a request is bypassed, the interceptor:
- Creates a real Node.js socket via `this.createConnection()`
- Attaches multiple event listeners to that socket (`lookup`, `connect`, `data`, `error`, `finish`, `close`, etc.)
- Never properly removes these event listeners when the request completes
This causes event listeners to accumulate on the underlying Node.js sockets, leading to significant memory growth over time.
Reproduction & Testing
I created a test that confirms the memory leak:
```ts
// test/modules/http/regressions/http-socket-memory-leak.test.ts
it('properly cleans up socket resources on passthrough requests', async () => {
  let socketListenerCount = 0

  // Track socket listener counts on real Node.js sockets
  const originalOn = require('net').Socket.prototype.on
  require('net').Socket.prototype.on = function (event: string, listener: Function) {
    if (this.constructor.name !== 'MockHttpSocket') {
      socketListenerCount++
    }
    return originalOn.call(this, event, listener)
  }

  // ... make 10 bypassed requests ...
  // (the elided part presumably derives `listenerGrowth` from `socketListenerCount`)

  // Without fix: listener count grows by ~90-105
  // With proper cleanup: should grow by <50
  expect(listenerGrowth).toBeLessThan(50)
})
```
Results: The test shows listener counts growing by 90-105 per 10 requests, confirming significant listener accumulation.
Proposed Solution
I've implemented a comprehensive fix that:
- Tracks all event listeners added to the original socket
- Implements proper cleanup when requests complete
- Triggers cleanup at multiple lifecycle points to ensure it happens
Key changes:
```ts
// Track listeners for cleanup
private originalSocketListeners: Map<string, Function> = new Map()

// In passthrough method - store listener references
for (const [event, listener] of Object.entries(listeners)) {
  this.originalSocketListeners.set(event, listener)
  socket.on(event as any, listener)
}

// Cleanup method
private cleanupOriginalSocket(): void {
  if (this.originalSocket && this.socketState === 'passthrough') {
    for (const [event, listener] of this.originalSocketListeners) {
      this.originalSocket.removeListener(event, listener as (...args: any[]) => void)
    }
    this.originalSocketListeners.clear()
  }
}

// Trigger cleanup in multiple events
finish: () => {
  this.cleanupOriginalSocket()
  this.emit('finish')
},
close: (hadError: any) => {
  this.cleanupOriginalSocket()
  this.emit('close', hadError)
},
end: () => {
  this.cleanupOriginalSocket()
  this.emit('end')
}
```
Current Status
The fix addresses the core issue by ensuring event listeners are properly removed. However, the test is still showing some listener accumulation, suggesting there may be additional edge cases or timing issues to resolve.
This provides a solid foundation for fixing the memory leak, though further refinement may be needed to handle all scenarios (socket pooling, keep-alive connections, etc.).
Impact
This memory leak affects any application using MSW with Node.js 20+ where requests are frequently bypassed to the original server, which is a common pattern in testing scenarios.
Would the maintainers be interested in a PR with this fix? I can refine the solution further and ensure all test cases pass before submitting.
@kettanaito did that help in any way?
@Stanzilla, yes, your investigation was extremely helpful! Does the current memory leak test represent the issue clearly? If so, we could try the fix you're proposing against that test to see if it helps solve the problem.
The solution can be as simple as adding a close listener to the passthrough socket and removing all listeners from the socket there:
```js
socket
  // ...other existing listeners.
  .on('close', () => socket.removeAllListeners())
```
The close event is similar to "finally"—it will be emitted when the socket is no longer needed or has errored. Would you like to give this solution a try? A pull request is highly welcome.
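A standalone way to see that behavior with a plain `net` socket (not the Interceptors code; the port below is assumed to have nothing listening on it):

```js
// 'close' fires after both graceful shutdowns and errors, which is what
// makes it a safe place to drop listeners. Port 9 is assumed to be closed.
import net from 'node:net'

const socket = net.connect({ host: '127.0.0.1', port: 9 })

socket
  .on('error', (error) => console.log('error:', error.code))
  .on('close', (hadError) => {
    console.log('close (hadError=%s); removing all listeners', hadError)
    socket.removeAllListeners()
  })
```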
Hi @Stanzilla, @kettanaito, I just recently updated to the latest MSW version and am able to reproduce this issue with Node v22.15.0. Do you know if a solution could land in one of the upcoming versions? Thank you so much for the support!
Sorry @kettanaito and @reysmerwvr, I was/am on vacation and did not really have time. If you want, Reysmer, you can take what we pointed out and make a PR for it?
Thanks for this thread, folks. We'd been hitting the same problem updating to node 20, 22 and 24. Pinning to [email protected] and @mswjs/[email protected] allowed us to circumvent it. Looking forward to the proper fix.
Update
First of all, huge thank you to everyone who participated in investigating this.
The memory leak should be resolved by these two fixes:
- https://github.com/mswjs/interceptors/pull/757
- https://github.com/mswjs/interceptors/pull/755
We were not freeing the internal HTTP parsers correctly. We also weren't handling the socket closure propagation from/to the passthrough socket at all (we were incorrectly listening only to the error event, which ignored the passthrough socket cleanup on graceful mock socket closures).
I trust these two fixes should, if not fix the issue, dramatically reduce the memory consumption by Interceptors.
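To make the second fix more concrete, here is a conceptual sketch of what "socket closure propagation" between the mock socket and the passthrough socket can look like; the names and structure are illustrative and not the actual code from the linked pull requests:

```js
// Conceptual sketch of socket closure propagation. Illustrative only;
// not the actual code from the linked pull requests.
function propagateClosure(mockSocket, passthroughSocket) {
  // Listening to 'close' (not just 'error') also covers graceful closures
  // of the mock socket, so the passthrough socket gets cleaned up too.
  mockSocket.on('close', () => {
    if (!passthroughSocket.destroyed) {
      passthroughSocket.destroy()
    }
    passthroughSocket.removeAllListeners()
  })

  // And the other direction: if the passthrough socket goes away,
  // tear the mock socket down as well.
  passthroughSocket.on('close', () => {
    if (!mockSocket.destroyed) {
      mockSocket.destroy()
    }
  })
}
```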
Release of Interceptors is blocked by https://github.com/mswjs/interceptors/pull/759 and the module shenanigans. Will come back to that sometime after the holidays.