
Memory leak in v3.1.0

Open zdm opened this issue 3 years ago • 5 comments

node-fetch: 3.1.0

This code produces a memory leak warning:

import fetch from "node-fetch";
import { Agent } from "http";

const agent = new Agent( {
    "maxSockets": 1,
} );

for ( let n = 0; n < 30; n++ ) {
    fetch( "http://example.com/", { agent } ).then( res => console.log( res.status ) );
}
200
200
(node:3152) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 close listeners added to [Socket]. Use emitter.setMaxListeners() to increase limit
    at _addListener (node:events:465:17)
    at Socket.prependListener (node:events:501:14)
    at ClientRequest.<anonymous> (file:///d:/downloads/111/node_modules/node-fetch/src/index.js:319:10)
    at ClientRequest.emit (node:events:390:28)
    at tickOnSocket (node:_http_client:757:7)
    at onSocketNT (node:_http_client:820:5)
    at processTicksAndRejections (node:internal/process/task_queues:84:21)
200
(node:3152) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 data listeners added to [Socket]. Use emitter.setMaxListeners() to increase limit
    at _addListener (node:events:465:17)
    at Socket.addListener (node:events:487:10)
    at Socket.Readable.on (node:internal/streams/readable:899:35)
    at ClientRequest.<anonymous> (file:///d:/downloads/111/node_modules/node-fetch/src/index.js:325:10)
    at ClientRequest.emit (node:events:390:28)
    at tickOnSocket (node:_http_client:757:7)
    at onSocketNT (node:_http_client:820:5)
    at processTicksAndRejections (node:internal/process/task_queues:84:21)
200
200
200
200
200
200
200

zdm avatar Nov 26 '21 14:11 zdm

That doesn't really look like a memory leak. You're calling fetch multiple times without await, so multiple fetch calls are running at the same time. This means it doesn't have a chance to close off the last connection before the next one opens.

The following uses await to ensure only one request is in flight at a time.

import fetch from "node-fetch";
import { Agent } from "http";

const agent = new Agent( {
    "maxSockets": 1,
});

async function* asyncGenerator() {
  let i = 0;
  while (i < 30) {
    yield i++;
  }
}

for await (const i of asyncGenerator()) {
    await fetch( "http://example.com/", { agent } ).then( res => console.log( res.status ) );
}

OmgImAlexis avatar Dec 08 '21 04:12 OmgImAlexis

node-fetch needs to increase/decrease the number of max listeners before setting the event listener. Or review the logic and set the listener only after the request has actually been sent, not after it has been queued.
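
To illustrate the first suggestion, here is a minimal sketch using a plain `EventEmitter` from `node:events` as a stand-in for the socket (this is not node-fetch code; `socketLike` is a made-up name). The point is that the limit must be raised before the listeners are attached:

```javascript
// Sketch only: demonstrates the MaxListenersExceededWarning mechanics with a
// plain EventEmitter standing in for the real socket (not node-fetch code).
import { EventEmitter } from "node:events";

const socketLike = new EventEmitter();

// The default limit is 10, so the 11th listener triggers the warning.
console.log(socketLike.getMaxListeners()); // 10

// Raise the limit BEFORE attaching listeners, as suggested above.
socketLike.setMaxListeners(30);

for (let n = 0; n < 30; n++) {
  socketLike.on("close", () => {});
}

// 30 listeners attached, and no MaxListenersExceededWarning is emitted.
console.log(socketLike.listenerCount("close")); // 30
```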

zdm avatar Dec 14 '21 11:12 zdm

@OmgImAlexis even in your example you are fetching multiple requests simultaneously. Getting the response headers can be very fast... but getting the actual body can be slower...

for await (const i of asyncGenerator()) {
-   await fetch( "http://example.com/", { agent } ).then( res => console.log( res.status ) );
+   const res = await fetch( "http://example.com/", { agent } )
+   console.log( res.status )
+   await res.arrayBuffer()  // Wait for body to arrive
}
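
Putting both fixes together, a self-contained sketch (using Node's built-in `http` module and a throwaway local server in place of node-fetch and example.com, so it runs without the package or network access; `getAndDrain` is a made-up helper): each request is awaited and its body fully drained before the next one starts, so the single `maxSockets: 1` connection is reused cleanly.

```javascript
// Self-contained sketch of the combined fix using node:http as a stand-in
// for node-fetch: await each request AND drain its body before the next one.
import { createServer, get, Agent } from "node:http";

// Throwaway local server so the example needs no network access.
const server = createServer((req, res) => res.end("ok"));
await new Promise(resolve => server.listen(0, resolve));
const url = `http://127.0.0.1:${server.address().port}/`;

// One socket max, as in the original report.
const agent = new Agent({ maxSockets: 1 });

// Resolves only after the WHOLE body has been read, mirroring
// `await res.arrayBuffer()` in the diff above.
function getAndDrain(url) {
  return new Promise((resolve, reject) => {
    get(url, { agent }, res => {
      res.on("data", () => {});                  // drain the body
      res.on("end", () => resolve(res.statusCode));
    }).on("error", reject);
  });
}

const statuses = [];
for (let n = 0; n < 30; n++) {
  statuses.push(await getAndDrain(url));         // strictly one at a time
}

console.log(`${statuses.length} requests completed`);
server.close();
agent.destroy();
```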

jimmywarting avatar Jan 16 '22 21:01 jimmywarting

related to: https://github.com/node-fetch/node-fetch/issues/1446

dnalborczyk avatar Jan 19 '22 18:01 dnalborczyk

Probably fixed by #1474.

felipec avatar Jan 24 '22 00:01 felipec