next-http-proxy-middleware
memory leak?
I was using this module like so:
import httpProxyMiddleware from 'next-http-proxy-middleware';
import { NextApiRequest, NextApiResponse } from 'next';
import { getSession } from '@auth0/nextjs-auth0';
import * as Sentry from '@sentry/node';
import { initSentry } from 'src/utils/sentry';
export default async (req: NextApiRequest, res: NextApiResponse) => {
  const session = getSession(req, res);
  return new Promise((resolve) =>
    httpProxyMiddleware(req, res, {
      target: process.env.NEXT_PUBLIC_API_BASE_URL,
      headers: { Authorization: `Bearer ${session?.idToken}`, 'Content-Type': 'application/json-patch+json' },
    }).catch(async (error) => {
      initSentry();
      Sentry.captureException(error);
      await Sentry.flush(2000);
      return resolve(error);
    })
  );
};
As you can see in the memory usage graph, there are spikes and some server restarts. But when I switched to http-proxy-middleware directly, like so:
import { NextApiRequest, NextApiResponse } from 'next';
import { getSession } from '@auth0/nextjs-auth0';
import { createProxyMiddleware } from 'http-proxy-middleware';
import * as Sentry from '@sentry/node';
import { initSentry } from 'src/utils/sentry';
export const config = {
  api: {
    bodyParser: false,
    externalResolver: true,
  },
};

export default createProxyMiddleware({
  target: process.env.NEXT_PUBLIC_API_BASE_URL,
  headers: {
    'Content-Type': 'application/json-patch+json',
  },
  selfHandleResponse: false,
  changeOrigin: true,
  xfwd: true,
  secure: true,
  proxyTimeout: 3000,
  ws: true,
  onProxyReq: (proxyReq, req: Request & NextApiRequest, res: Response & NextApiResponse) => {
    proxyReq.setHeader('Authorization', `Bearer ${getSession(req, res)?.idToken}`);
  },
  onError: (err, req: Request & NextApiRequest, res: Response & NextApiResponse) => {
    initSentry();
    Sentry.captureException(err);
    Sentry.flush(2000);
    res.end('Something went wrong.');
  },
});
the memory usage becomes stable. I also get correct response status codes from the API, such as 404.

Hi @FDiskas 😀
Thanks for reporting the problem.
I tried to reproduce it in my local environment, but I couldn't.
When using this middleware, did the memory gradually increase, and was the server restarted by OOM?
Can you elaborate on the attached graph?
Hi, before the change the memory was slowly increasing over time. Those sudden drops in the spikes were probably caused by restarts. The project is running in a Docker container on AWS. I did not investigate deeper. I just noticed that during development the local server stops after some time and I need to restart it. After changing to the plain middleware, everything works without strange crashes. Those graphs are taken from AWS instance metrics.
@FDiskas
Thank you for the explanation. 😀
Could you tell me what version of next-http-proxy-middleware you are currently using for your project? Prior to v1.0.10, memory leaks may have occurred when using HTTP HEAD, DELETE, CONNECT, OPTIONS, and PUT methods. 😭
I'm not yet certain, but it seems that memory accumulates in the inspector when requests are repeated many times using an ab test. I'll check a little more. 🤔 Thanks for reporting.
As soon as I noticed there was a new release I updated, but I didn't notice any improvement regarding memory.
😭
Could be related https://github.com/stegano/next-http-proxy-middleware/issues/21
If possible, try applying the code like this. (The problem I've identified is that when an async function that returns a promise is exported as the handler, memory grows as the number of requests increases.)
// as-is
export default async (req: NextApiRequest, res: NextApiResponse) => {
  const session = getSession(req, res);
  return new Promise((resolve) =>
    httpProxyMiddleware(req, res, {
      target: process.env.NEXT_PUBLIC_API_BASE_URL,
      headers: { Authorization: `Bearer ${session?.idToken}`, 'Content-Type': 'application/json-patch+json' },
    }).catch(async (error) => {
      initSentry();
      Sentry.captureException(error);
      await Sentry.flush(2000);
      return resolve(error);
    })
  );
};

// to-be ✨
export default (req: NextApiRequest, res: NextApiResponse) => {
  const session = getSession(req, res);
  return httpProxyMiddleware(req, res, {
    target: process.env.NEXT_PUBLIC_API_BASE_URL,
    headers: { Authorization: `Bearer ${session?.idToken}`, 'Content-Type': 'application/json-patch+json' },
  }).catch(async (error) => {
    initSentry();
    Sentry.captureException(error);
    await Sentry.flush(2000);
  });
};
Test result
Request sent 4000 times

// Test code
export default (req: NextApiRequest, res: NextApiResponse) => httpNextProxyMiddleware(req, res, {
  target: 'http://localhost:8080',
}).catch(async (error) => console.log(error));

- No continuous memory growth

Request sent 4000 times

// Test code
export default async (req: NextApiRequest, res: NextApiResponse) => new Promise(
  (resolve) => httpNextProxyMiddleware(req, res, {
    target: 'http://localhost:8080',
  }).catch(async (error) => resolve(error)),
);

- Memory usage is constantly increasing
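For anyone wanting to repeat the 4000-request comparison without ab, a plain Node script works as well. This is only a sketch: it assumes Node 18+ for the global fetch, and that the proxy route is served at /api/proxy on port 3000 (adjust to your project).

// Fire 4000 sequential requests at the proxy route while watching the server's memory.
const TARGET = 'http://localhost:3000/api/proxy'; // assumed route path

const run = async (): Promise<void> => {
  for (let i = 1; i <= 4000; i += 1) {
    const response = await fetch(TARGET);
    await response.arrayBuffer(); // consume the body so the connection can be reused
    if (i % 500 === 0) {
      console.log(`${i} requests sent`);
    }
  }
};

run().catch((err) => console.error(err));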
I will try it on Monday and let you know.
Looks like something is still wrong.
import { NextApiRequest, NextApiResponse } from 'next';
import { getSession } from '@auth0/nextjs-auth0';
import httpProxyMiddleware from 'next-http-proxy-middleware';
import * as Sentry from '@sentry/node';
import { initSentry } from 'src/utils/sentry';

export const config = {
  api: {
    bodyParser: false,
    externalResolver: true,
  },
};

export default async (req: NextApiRequest, res: NextApiResponse) => {
  const session = getSession(req, res);
  try {
    return await httpProxyMiddleware(req, res, {
      target: process.env.NEXT_PUBLIC_API_BASE_URL,
      headers: {
        Authorization: `Bearer ${session?.idToken}`,
        'Content-Type': 'application/json-patch+json',
      },
    });
  } catch (error: any) {
    initSentry();
    Sentry.captureException(error);
    await Sentry.flush(2000);
    return error?.message;
  }
};
And after some time I get a crash:
node_modules/http-proxy/lib/http-proxy/index.js:120
    throw err;
    ^

Error: socket hang up
    at connResetException (internal/errors.js:617:14)
    at TLSSocket.socketCloseListener (_http_client.js:443:25)
    at TLSSocket.emit (events.js:327:22)
    at TLSSocket.EventEmitter.emit (domain.js:486:12)
    at net.js:673:12
    at TCP.done (_tls_wrap.js:563:7) {
  code: 'ECONNRESET'
}
Hi @FDiskas,
Please do not return an awaited promise 😭 Is there any reason to use async/await?
export default async (req: NextApiRequest, res: NextApiResponse) => { // <-- fix to `export default (req: NextApiRequest, res: NextApiResponse)`
  const session = getSession(req, res);
  try {
    return await httpProxyMiddleware(req, res, { // <-- fix to `return httpProxyMiddleware`
      target: process.env.NEXT_PUBLIC_API_BASE_URL,
      headers: {
        Authorization: `Bearer ${session?.idToken}`,
        'Content-Type': 'application/json-patch+json',
      },
    });
  } catch (error: any) {
    initSentry();
    Sentry.captureException(error);
    await Sentry.flush(2000);
    return error?.message;
  }
};
The reason is simple: ESLint rules require it. :) But I will check that later.
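For what it's worth, if the rule forcing the handler to be async is something like @typescript-eslint/promise-function-async (an assumption; the rule isn't named in this thread), it can be disabled for this one export so the handler stays synchronous:

import { NextApiRequest, NextApiResponse } from 'next';
import httpProxyMiddleware from 'next-http-proxy-middleware';

// The rule name below is assumed; use whichever rule is actually flagging the handler.
// eslint-disable-next-line @typescript-eslint/promise-function-async
export default (req: NextApiRequest, res: NextApiResponse) =>
  httpProxyMiddleware(req, res, {
    target: process.env.NEXT_PUBLIC_API_BASE_URL,
  }).catch((error) => {
    res.end(error.message);
  });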
Tested - memory is still increasing - I used code like this:
export const config = {
  api: {
    bodyParser: false,
    externalResolver: true,
  },
};

export default (req: NextApiRequest, res: NextApiResponse) => {
  return httpProxyMiddleware(req, res, {
    preserveHeaderKeyCase: true,
    target: process.env.NEXT_PUBLIC_API_BASE_URL,
    timeout: 3000,
    headers: {
      'Content-Type': 'application/json-patch+json',
    },
  }).catch(async (error) => {
    res.end(error.message);
  });
};
I built the app and then started it using
NODE_OPTIONS='--inspect' next start
attached the Chrome inspector and compared memory snapshots; the main growth points to next-http-proxy-middleware.

And after reverting back to http-proxy-middleware I got something better. Memory is still growing, but this time a different package is responsible.
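As a rough cross-check that does not require heap snapshots, heap usage can also be logged from inside the API route while requests are replayed. This is only a sketch: the handler body is the minimal proxy from above, and the 100-request logging interval is arbitrary.

import { NextApiRequest, NextApiResponse } from 'next';
import httpProxyMiddleware from 'next-http-proxy-middleware';

let requestCount = 0;

export default (req: NextApiRequest, res: NextApiResponse) => {
  requestCount += 1;
  if (requestCount % 100 === 0) {
    // heapUsed reflects the server process, which is where the suspected leak lives.
    const heapUsedMb = process.memoryUsage().heapUsed / 1024 / 1024;
    console.log(`request #${requestCount}: heapUsed ~ ${heapUsedMb.toFixed(1)} MB`);
  }
  return httpProxyMiddleware(req, res, {
    target: process.env.NEXT_PUBLIC_API_BASE_URL,
  }).catch((error) => {
    res.end(error.message);
  });
};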

JS memory can grow and shrink while the process is running; over time, unused memory is removed by the GC. In the case above, is there an error response when requesting through the proxy? If you are getting an HTTP error response, remove the async from the catch clause and try again:
export default (req: NextApiRequest, res: NextApiResponse) => {
  return httpProxyMiddleware(req, res, {
    preserveHeaderKeyCase: true,
    target: process.env.NEXT_PUBLIC_API_BASE_URL,
    timeout: 3000,
    headers: {
      'Content-Type': 'application/json-patch+json',
    },
  }).catch((error) => { // <-- here
    res.end(error.message);
  });
};
If the catch does not return a promise result, the promise will not be removed by the GC and will persist.
Thank you for the good explanation. In this case no request errors were thrown. But as you said, I probably need to ignore such small memory changes. Could we have a good working example in the repository?
This library can be used in a variety of ways, which makes it difficult to give specific examples. However, when a good reference like this issue is recorded, I think it can be referred to later.
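For later readers, here is a minimal sketch that combines the recommendations from this thread (synchronous export, externalResolver enabled, no async in the catch); the target env variable and content type come from the snippets above and may differ in your project:

import { NextApiRequest, NextApiResponse } from 'next';
import httpProxyMiddleware from 'next-http-proxy-middleware';

export const config = {
  api: {
    bodyParser: false,
    externalResolver: true,
  },
};

// The handler returns the middleware's promise directly instead of wrapping it
// in `new Promise(...)` or marking the function async.
export default (req: NextApiRequest, res: NextApiResponse) =>
  httpProxyMiddleware(req, res, {
    target: process.env.NEXT_PUBLIC_API_BASE_URL,
    headers: {
      'Content-Type': 'application/json-patch+json',
    },
  }).catch((error) => {
    // No async here: the catch returns a plain value so the settled promise can be collected.
    res.end(error.message);
  });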