
[node v20] Fetch with http2 hangs when not reading the response body

Open jdelbick opened this issue 9 months ago • 1 comment

Description For HTTP/2, the stream doesn't get closed if the response body is not read. Skipping the body read is fairly common for error cases in Node.js, and this problem doesn't occur with HTTP/1 or with other Node fetch implementations like node-fetch. Leaving connections open can cause tests to hang, and in extreme cases, if the service makes a lot of HTTP requests that fail, it can result in

ERROR	(node:8) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 1001 error listeners added to [ClientHttp2Session]. Use emitter.setMaxListeners() to increase limit
(Use `node --trace-warnings ...` to show where the warning was created)

My suggestion is to add a warning about this and recommend always reading the response body when using HTTP/2.
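
For example, a minimal sketch of the kind of pattern such a warning could recommend (my own illustration, not an adobe/fetch API; fetchUrlDraining is a hypothetical helper name and it only uses the standard body-consuming methods): drain the body even on the error path before throwing.

import { fetch } from '@adobe/fetch';

async function fetchUrlDraining(url) {
    const response = await fetch(url);
    if (!response.ok) {
        // Drain the body so the underlying HTTP/2 stream is consumed and the
        // session can be cleaned up; drain errors are ignored since we are failing anyway.
        await response.text().catch(() => {});
        throw new Error(`Failed to fetch ${url}: ${response.status} ${response.statusText}`);
    }
    return response.text();
}

Structuring it this way keeps the error check first while still making sure the body is consumed on every path.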

To Reproduce This script hangs when run locally with any value of times > 1 (Node v20).

import { fetch } from '@adobe/fetch';
async function fetchUrl(url) {
    const response = await fetch(url);
    if (!response.ok) {
        throw new Error(`Failed to fetch ${url}: ${response.status} ${response.statusText}`);
    }
    const data = await response.text();
    return data;
}

async function fetchMultipleTimesSync(url, times) {
    for (let i = 1; i <= times; i++) {
        try {
            await fetchUrl(url);
        }
        catch (error) {
            console.error(error);
        }
    }
}

const url = 'https://the-internet.herokuapp.com/status_codes/500';
const times = 2;
await fetchMultipleTimesSync(url, times)

node-fetch doesn't hang in this case:

import fetch from 'node-fetch';
async function fetchUrl(url) {
    const response = await fetch(url);
    if (!response.ok) {
        throw new Error(`Failed to fetch ${url}: ${response.status} ${response.statusText}`);
    }
    const data = await response.text();
    return data;
}
// same code as above ...

However, if you just move the body read to before the error check, it works fine:

import { fetch } from '@adobe/fetch';
async function fetchUrl(url) {
    const response = await fetch(url);
    const data = await response.text();
    if (!response.ok) {
        throw new Error(`Failed to fetch ${url}: ${response.status} ${response.statusText}`);
    }
    // const data = await response.text();
    return data;
}

async function fetchMultipleTimesSync(url, times) {
    for (let i = 1; i <= times; i++) {
        try {
            await fetchUrl(url);
        }
        catch (error) {
            console.error(error);
        }
    }
}

// const url = 'https://example.com';
const url = 'https://the-internet.herokuapp.com/status_codes/500';
const times = 2;
await fetchMultipleTimesSync(url, times)

Expected behavior The adobe/fetch docs should call this out, along the lines of: “You normally don’t have to do this with other fetch implementations, but in our case cleanup only happens after the response body is consumed; if it isn't, leaks can happen. Make sure to always consume the response in all cases (error or not), e.g. with await response.text() or another body-consuming method.”
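
For illustration, a hedged sketch of what that guidance looks like in practice (checkStatus is a hypothetical helper; it sticks to response.text(), which the examples above already use): even when the payload itself isn't needed, consume it before moving on.

import { fetch } from '@adobe/fetch';

async function checkStatus(url) {
    const response = await fetch(url);
    // Only the status matters here, but the body is still consumed so the
    // HTTP/2 stream is released; text() is just one body-consuming method.
    await response.text();
    return response.ok;
}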

Version: v4.1.2

jdelbick · May 03 '24 17:05
