node-fetch
Memory leak in v3.1.0
node-fetch: 3.1.0
This code produces a memory leak warning:
import fetch from "node-fetch";
import { Agent } from "http";

const agent = new Agent( {
    "maxSockets": 1,
} );

for ( let n = 0; n < 30; n++ ) {
    fetch( "http://example.com/", { agent } ).then( res => console.log( res.status ) );
}
200
200
(node:3152) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 close listeners added to [Socket]. Use emitter.setMaxListeners() to increase limit
at _addListener (node:events:465:17)
at Socket.prependListener (node:events:501:14)
at ClientRequest.<anonymous> (file:///d:/downloads/111/node_modules/node-fetch/src/index.js:319:10)
at ClientRequest.emit (node:events:390:28)
at tickOnSocket (node:_http_client:757:7)
at onSocketNT (node:_http_client:820:5)
at processTicksAndRejections (node:internal/process/task_queues:84:21)
200
(node:3152) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 data listeners added to [Socket]. Use emitter.setMaxListeners() to increase limit
at _addListener (node:events:465:17)
at Socket.addListener (node:events:487:10)
at Socket.Readable.on (node:internal/streams/readable:899:35)
at ClientRequest.<anonymous> (file:///d:/downloads/111/node_modules/node-fetch/src/index.js:325:10)
at ClientRequest.emit (node:events:390:28)
at tickOnSocket (node:_http_client:757:7)
at onSocketNT (node:_http_client:820:5)
at processTicksAndRejections (node:internal/process/task_queues:84:21)
200
200
200
200
200
200
200
That doesn't really look like a memory leak. You're calling fetch multiple times without await, so multiple fetch calls are running at the same time. This means the previous connection doesn't get a chance to close before the next one opens.
The following uses await to ensure only one request is in flight at a time.
import fetch from "node-fetch";
import { Agent } from "http";

const agent = new Agent( {
    "maxSockets": 1,
} );

async function* asyncGenerator() {
    let i = 0;
    while (i < 30) {
        yield i++;
    }
}

for await (const i of asyncGenerator()) {
    await fetch( "http://example.com/", { agent } ).then( res => console.log( res.status ) );
}
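For what it's worth, the async generator isn't required here; a plain for loop with await serializes the calls the same way. In the sketch below, `request` is a hypothetical stand-in for fetch (an assumption, so the example runs without the network) and only illustrates the ordering guarantee:

```javascript
// `request` is a stand-in for fetch (an assumption, to keep the sketch self-contained).
const order = [];
const request = i =>
    new Promise(resolve => setImmediate(() => {
        order.push(i);
        resolve(i);
    }));

// A plain for loop with await: each call completes before the next one starts.
for (let i = 0; i < 5; i++) {
    await request(i);
}

console.log(order); // [ 0, 1, 2, 3, 4 ]
```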
The number of maxListeners needs to be adjusted before the event listener is set. Alternatively, review the logic and set the listener only after the request has actually been sent, not when it is queued.
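The setMaxListeners suggestion can be sketched with a plain EventEmitter standing in for the shared Socket (the limit of 40 and the count of 30 are assumptions for illustration):

```javascript
import { EventEmitter } from "events";

// A plain EventEmitter stands in for the single keep-alive Socket
// that all 30 queued requests end up sharing.
const socket = new EventEmitter();

// Node's default warning threshold is 10 listeners per event name.
const before = socket.getMaxListeners(); // 10

// Raising the limit above the number of queued requests suppresses
// the MaxListenersExceededWarning.
socket.setMaxListeners(40);
for (let i = 0; i < 30; i++) {
    socket.on("close", () => {});
}

console.log(socket.listenerCount("close")); // 30, no warning emitted
```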
@OmgImAlexis even in your example you are fetching multiple requests simultaneously. Getting the response headers can be very fast... but getting the actual body can be slower...
for await (const i of asyncGenerator()) {
-     await fetch( "http://example.com/", { agent } ).then( res => console.log( res.status ) );
+     const res = await fetch( "http://example.com/", { agent } );
+     console.log( res.status );
+     await res.arrayBuffer(); // Wait for the body to arrive
}
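The body-consumption point matters for keep-alive socket reuse in general, not just for node-fetch. A minimal sketch using only Node's built-in http module against a local throwaway server (the server, port 0, and the loop count are assumptions for illustration): draining each response body releases the single maxSockets: 1 socket so the next request can proceed.

```javascript
import http from "http";

// Local throwaway server so the sketch runs without the network.
const server = http.createServer((req, res) => res.end("ok"));
await new Promise(resolve => server.listen(0, resolve));
const { port } = server.address();

// One keep-alive socket shared by every request, as in the issue.
const agent = new http.Agent({ keepAlive: true, maxSockets: 1 });

const statuses = [];
for (let i = 0; i < 5; i++) {
    await new Promise((resolve, reject) => {
        http.get({ port, agent }, res => {
            statuses.push(res.statusCode);
            res.resume();          // drain the body so the socket is freed
            res.on("end", resolve);
        }).on("error", reject);
    });
}

console.log(statuses); // [ 200, 200, 200, 200, 200 ]
agent.destroy();
server.close();
```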
related to: https://github.com/node-fetch/node-fetch/issues/1446
Probably fixed by #1474.