
Using hono-node with the cors middleware causes a memory leak

krzkz94 opened this issue 4 months ago · 24 comments

What version of Hono are you using?

4.8.12

What runtime/platform is your app running on? (with version if possible)

Node

What steps can reproduce the bug?

Clone this repo https://github.com/krzkz94/hono-context-memory-leak

Follow the outlined steps (install, run app, run autocannon)

Notice heap size increasing

npx tsx src/index.ts
Server is running on http://localhost:3000
{ before: '9.09', after: '9.09' }
{ before: '9.11', after: '9.11' }
{ before: '51.49', after: '51.50' }
{ before: '67.14', after: '67.14' }
{ before: '111.91', after: '111.91' }
{ before: '155.77', after: '155.77' }
{ before: '126.81', after: '126.81' }
{ before: '162.79', after: '162.79' }
{ before: '207.94', after: '207.94' }
{ before: '254.23', after: '254.23' }
{ before: '302.14', after: '302.14' }
{ before: '342.52', after: '342.52' }
{ before: '380.52', after: '380.52' }
{ before: '248.96', after: '248.96' }
{ before: '288.23', after: '288.23' }
{ before: '339.07', after: '339.07' }
{ before: '382.03', after: '382.03' }
{ before: '424.34', after: '424.34' }
{ before: '458.85', after: '458.85' }
{ before: '511.55', after: '511.55' }
{ before: '552.79', after: '552.79' }
{ before: '596.38', after: '596.38' }
{ before: '590.64', after: '590.64' }

Node 22

What is the expected behavior?

Heap size stays consistent with minimal increases. For example, below is the result without the cors middleware.

tsx watch src/index.ts

Server is running on http://localhost:3000
{ before: '17.41', after: '17.41' }
{ before: '11.33', after: '11.33' }
{ before: '10.58', after: '10.58' }
{ before: '23.65', after: '23.65' }
{ before: '14.78', after: '14.78' }
{ before: '17.13', after: '17.13' }
{ before: '16.21', after: '16.21' }
{ before: '16.23', after: '16.23' }
{ before: '13.44', after: '13.45' }
{ before: '14.90', after: '14.90' }
{ before: '21.29', after: '21.29' }
{ before: '18.71', after: '18.71' }
{ before: '14.91', after: '14.91' }
{ before: '13.46', after: '13.46' }

What do you see instead?

Heap size increases and stays high even after the test ends

{ before: '8.94', after: '8.94' }
{ before: '8.96', after: '8.96' }
{ before: '30.07', after: '30.07' }
{ before: '77.18', after: '77.18' }
{ before: '87.71', after: '87.71' }
{ before: '131.37', after: '131.37' }
{ before: '164.34', after: '164.34' }
{ before: '206.36', after: '206.36' }
{ before: '157.07', after: '157.07' }
{ before: '189.13', after: '189.13' }
{ before: '232.51', after: '232.51' }
{ before: '280.68', after: '280.68' }
{ before: '328.30', after: '328.30' }
{ before: '366.00', after: '366.00' }
{ before: '410.61', after: '410.61' }
{ before: '461.95', after: '461.95' }
{ before: '491.38', after: '491.38' }
{ before: '314.48', after: '314.48' }
{ before: '363.24', after: '363.24' }
{ before: '412.60', after: '412.60' }
{ before: '451.58', after: '451.58' }
{ before: '492.87', after: '492.87' }
{ before: '509.95', after: '509.95' } <- end of test
{ before: '509.95', after: '509.95' }
{ before: '509.96', after: '509.96' }
{ before: '509.96', after: '509.96' }
{ before: '509.97', after: '509.97' }
{ before: '509.97', after: '509.97' }
{ before: '509.97', after: '509.98' }
{ before: '509.98', after: '509.98' }
{ before: '509.99', after: '509.99' }
{ before: '509.99', after: '509.99' }
{ before: '509.99', after: '510.00' }

Additional information

It seems to be a bug in node-server and how things are handled when accessing the headers: running this with Bun's native HTTP server (bun --expose-gc src/index.ts) doesn't cause issues; for that, I also removed the serve() wrapper around the object and just exported the object as the default.

Basically, accessing c.req.header('origin') (or any header, for that matter) builds memory pressure, and under constant load the memory requirement keeps increasing without ever getting a chance to be cleaned up until either (a) the load stops or (b) the application runs out of memory.
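
For reference, a minimal sketch of the kind of app that triggers this (my reconstruction of what the linked repo boils down to; the handler and the memory-logging interval are illustrative, not copied from it):

import { serve } from '@hono/node-server'
import { Hono } from 'hono'
import { cors } from 'hono/cors'

const app = new Hono()

// The cors middleware reads the Origin header on every request, which is
// the header access described above as building memory pressure.
app.use('*', cors())
app.get('/', (c) => c.text('ok'))

// Log heap usage once per second; global.gc is only defined when the
// process runs with --expose-gc.
setInterval(() => {
  const before = process.memoryUsage().heapUsed / 1024 / 1024
  global.gc?.()
  const after = process.memoryUsage().heapUsed / 1024 / 1024
  console.log({ before: before.toFixed(2), after: after.toFixed(2) })
}, 1_000)

serve({ fetch: app.fetch, port: 3000 }, (info) => {
  console.log(`Server is running on http://localhost:${info.port}`)
})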

krzkz94 avatar Aug 04 '25 22:08 krzkz94

I'm facing the same issue, my server is crashing every 10 mins.

omarkhatibco avatar Aug 05 '25 15:08 omarkhatibco

@omarkhatibco

I'll work on investigating it. One question, are you running your app on Node.js? or other runtime?

yusukebe avatar Aug 05 '25 19:08 yusukebe

Hey @yusukebe ,

It's a simple node.js server with 1 endpoint.

You can see the memory leak in the image; it started to happen yesterday when I merged my PR, which includes the latest version of Hono, v4.8.12.

The workaround was to disable the cors middleware and just set normal headers on the response.

Image

omarkhatibco avatar Aug 05 '25 19:08 omarkhatibco

@omarkhatibco

it started to happen yesterday when I merged my PR, which includes the latest version of Hono, v4.8.12

Does this mean the previous version you used did not cause a memory leak? Let me know the previous version.

yusukebe avatar Aug 05 '25 19:08 yusukebe

@yusukebe I upgraded from 4.7.9 because I experienced the issue there, and it did not go away with 4.8.12. Mind you, I had some other memory-management issues in my app and managed to fix all of them except this one, so I don't want to lead you the wrong way by saying it was already there in 4.7.9.

krzkz94 avatar Aug 05 '25 19:08 krzkz94

@krzkz94 Okay, thanks!

yusukebe avatar Aug 05 '25 19:08 yusukebe

@krzkz94

What version of @hono/node-server do you use?

yusukebe avatar Aug 05 '25 19:08 yusukebe

Hey @yusukebe

I just noticed that the issue started to appear on the 25th of July. I couldn't test which version exactly, but I think the last one I had no problem with was 4.8.5, with @hono/node-server at 1.17.0.

You can see the image below.

Image

omarkhatibco avatar Aug 05 '25 20:08 omarkhatibco

@yusukebe Should've mentioned it, my bad

"@hono/node-server": "1.18.1", "hono": "4.8.12"

krzkz94 avatar Aug 05 '25 20:08 krzkz94

@omarkhatibco @krzkz94 Thanks!

yusukebe avatar Aug 05 '25 20:08 yusukebe

This is a bug in @hono/node-server. You can reproduce it with the following code, without Hono. It should be fixed. I'll transfer this issue to the honojs/node-server repository.

import { serve } from '@hono/node-server'

// Log heap usage every second, before and after forcing a GC run
// (global.gc is only defined when the process runs with --expose-gc).
setInterval(() => {
  const before = process.memoryUsage().heapUsed / 1024 / 1024
  global.gc?.()
  const after = process.memoryUsage().heapUsed / 1024 / 1024
  console.log({ before: before.toFixed(2), after: after.toFixed(2) })
}, 1_000)

serve(
  {
    fetch: (req) => {
      req.headers.get('foo') // <=== reading any header triggers the growth
      return new Response('foo')
    },
    port: 3000
  },
  (info) => {
    console.log(`Server is running on http://localhost:${info.port}`)
  }
)
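
To drive this repro under load, something along the lines of the autocannon step from the original issue can be used (the filename and flags below are my example, not taken from either repo):

node --expose-gc server.mjs   # save the snippet above as server.mjs (hypothetical filename)
npx autocannon -c 100 -d 60 http://localhost:3000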

yusukebe avatar Aug 05 '25 20:08 yusukebe

Hi @usualoma

Sorry to bother you again and again.

This memory leak is caused by the lightweight request introduced in the Node.js adapter. As proof, the problem does not occur with version 1.2.3, which predates 1.3.0, the release that includes the lightweight request feature.

Here is the minimal repro: https://github.com/yusukebe/hono-nodejs-adapter-memory-leak

Can you investigate it?

yusukebe avatar Aug 05 '25 20:08 yusukebe

I'll check. Please wait a moment.

usualoma avatar Aug 05 '25 21:08 usualoma

The same problem that was fixed in the PR below is occurring.

https://github.com/honojs/node-server/pull/172

Making the following change prevents the memory leak.

diff --git i/src/request.ts w/src/request.ts
index 61664fb..f5f0f19 100644
--- i/src/request.ts
+++ w/src/request.ts
@@ -62,7 +62,7 @@ const newRequestFromIncoming = (
   const init = {
     method: method,
     headers: headerRecord,
-    signal: abortController.signal,
+    _signal: abortController.signal, // Do not pass signal
   } as RequestInit
 
   if (method === 'TRACE') {

usualoma avatar Aug 05 '25 21:08 usualoma

Based on my investigation, I have come to the following conclusion. There is room for adjustment, but I do not believe this is a memory leak in node-server.

global.gc

To use this, you need to specify the --expose-gc option.

$ node -e 'console.log(global.gc)'
undefined
$ node --expose-gc  -e 'console.log(global.gc)'
[Function: gc]

If you make the following changes, you should be able to confirm that memory usage does not continue to increase.

diff --git i/package.json w/package.json
index c1f8661..09f02b3 100644
--- i/package.json
+++ w/package.json
@@ -2,9 +2,9 @@
   "name": "hono-nodejs-adapter-memory-leak",
   "type": "module",
   "scripts": {
-    "dev": "tsx watch src/index.ts",
+    "dev": "NODE_OPTIONS=--expose-gc tsx watch src/index.ts",
     "build": "tsc",
-    "start": "node dist/index.js"
+    "start": "node --expose-gc dist/index.js"
   },
   "dependencies": {
     "@hono/node-server": "^1.18.1"

Improvements when using cors middleware

In the comment linked below, it was mentioned that simply using cors causes a fallback to the original globalThis.Request, which triggers the problem. Therefore, for the cors case (or rather, for simple header access), it may be better to optimize node-server further.

https://github.com/honojs/node-server/issues/266#issuecomment-3146193711
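
To make the "fallback" concrete, here is a purely illustrative sketch of the lightweight-request idea (hypothetical names, not the actual node-server source): cheap header reads are answered from the raw Node.js IncomingMessage, and a full globalThis.Request is only constructed lazily when some code path needs it.

import type { IncomingMessage } from 'node:http'

// Illustrative only; names and structure are hypothetical.
class LightweightRequestSketch {
  #incoming: IncomingMessage
  #real?: Request

  constructor(incoming: IncomingMessage) {
    this.#incoming = incoming
  }

  // Cheap path: read a header directly from the raw Node.js request.
  header(name: string): string | undefined {
    const value = this.#incoming.headers[name.toLowerCase()]
    return Array.isArray(value) ? value[0] : value
  }

  // Expensive fallback: build a full Request once and cache it. In the real
  // adapter this is where the init (including the AbortSignal) is assembled,
  // which is the code path the diff above changes.
  get fallback(): Request {
    this.#real ??= new Request(`http://localhost${this.#incoming.url ?? '/'}`, {
      method: this.#incoming.method,
    })
    return this.#real
  }
}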

usualoma avatar Aug 06 '25 00:08 usualoma

Hi @usualoma, we've created a patch with the changes outlined here

+    _signal: abortController.signal, // Do not pass signal

Basically, renaming the signal variable (I assume so it can't be referenced somewhere else) has solved our issue with the memory leak when using the cors middleware. Memory is released even under pressure and returns to expected levels, even over longer periods of time (hours).

I've attached the patch below for anyone who wants to try it out and see if it fixes their issue(s): @hono%[email protected]
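
For anyone wanting to try the same change locally, one possible workflow (an assumption on my part; the exact tooling used to produce the patch above isn't stated) is a package patch, e.g. with pnpm:

# Open an editable copy of the installed package (pnpm prints a temp directory)
pnpm patch @hono/[email protected]
# In that directory, apply the one-line change from the diff above
# (signal -> _signal in the request construction), then persist it:
pnpm patch-commit <printed-directory>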

krzkz94 avatar Aug 06 '25 08:08 krzkz94

Any updates?

tgdn avatar Aug 29 '25 14:08 tgdn

Within the scope I verified, explicitly executing global.gc() seemed to free the memory.

I created a PR that explicitly executes global.gc() every N requests when the --expose-gc and requestsPerForcedGC options are specified. Could someone verify whether this resolves the issue?

https://github.com/honojs/node-server/pull/283
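
If it helps anyone testing, usage would presumably look something like the sketch below; the exact option name and placement are defined by the PR, so treat this as a guess:

// Hypothetical usage of the option proposed in the PR above (shape assumed,
// not confirmed): force global.gc() every N requests.
serve({
  fetch: app.fetch,
  port: 3000,
  requestsPerForcedGC: 10_000,
})

// Run with the flag that exposes global.gc:
//   node --expose-gc dist/index.js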

usualoma avatar Oct 19 '25 11:10 usualoma

@usualoma the memory logger here https://github.com/krzkz94/hono-context-memory-leak/blob/main/src/index.ts#L8 does run global.gc() (not every N requests, but once per second), but if memory pressure is high (a constant flow of requests, as on a production system), GC never gets a chance to reclaim the memory before the application runs out of it.

Accessing headers seems to cause this problem. Eventually the memory will drop, but never back to baseline (where it started). With the patch you suggested (patching the signal/abortController) it works perfectly, and we have been running it in production with no issues for the last 3+ months.

krzkz94 avatar Oct 21 '25 10:10 krzkz94

Hi @krzkz94, thanks for your reply!

Just to confirm: As shown in https://github.com/honojs/node-server/issues/269#issuecomment-3157022279, the --expose-gc flag is required to execute global.gc(). Did you run it with this flag? (It wasn't specified in the repository you shared.)

Based on my local tests, when the --expose-gc flag is specified, GC appears to be executed.

usualoma avatar Oct 21 '25 11:10 usualoma

I think there's also a way to make this optional, like this:

+    _signal: abortController.signal, // Do not pass signal

Before that, I want to verify the behavior in an environment where I can reproduce the memory leak (memory that is not recovered even after GC runs).

usualoma avatar Oct 21 '25 11:10 usualoma

Hmm, after digging into this for a while, I don’t think this is an actual memory leak.

Because of how Undici is implemented, the AbortController objects tend to get promoted into the old generation, so unless you explicitly call global.gc(), they’re not collected (or at least not collected quickly). https://github.com/nodejs/undici/blob/92715645444a41d3d6b3be0a458630cf32150adf/lib/web/fetch/request.js#L33-L35

What I found is that we can actually work around this by setting --max-old-space-size, like this:

diff --git i/package.json w/package.json
index c1f8661..8677704 100644
--- i/package.json
+++ w/package.json
@@ -4,7 +4,7 @@
   "scripts": {
     "dev": "tsx watch src/index.ts",
     "build": "tsc",
-    "start": "node dist/index.js"
+    "start": "node --max-old-space-size=1024 dist/index.js"
   },
   "dependencies": {
     "@hono/node-server": "^1.18.1"

usualoma avatar Nov 02 '25 07:11 usualoma

As a practical workaround for this, I suppose it would be feasible to enable something equivalent via an option. However, I still consider this merely a symptomatic solution, so I'm rather torn about whether it should be implemented.

+ _signal: abortController.signal, // Do not pass signal

usualoma avatar Nov 02 '25 07:11 usualoma

Hi @krzkz94, thanks for your reply!

Just to confirm: As shown in #269 (comment), the --expose-gc flag is required to execute global.gc(). Did you run it with this flag? (It wasn't specified in the repository you shared.)

Based on my local tests, when the --expose-gc flag is specified, GC appears to be executed.

Yes, --expose-gc was passed, and global.gc() is called:

https://github.com/krzkz94/hono-context-memory-leak/blob/main/src/index.ts#L8

The file was executed with bun --expose-gc src/index.ts

krzkz94 avatar Nov 02 '25 20:11 krzkz94