# RequestError: unexpected end of file

### Describe the bug

- Node.js version: 18.15.0
- OS & version: Linux Mint 21.1 Cinnamon 64-bit

### Actual behavior
```
node:internal/process/esm_loader:97
    internalBinding('errors').triggerUncaughtException(
                              ^
RequestError: unexpected end of file
    at PassThrough.<anonymous> (file:///home/volfmatej/temp/got-repro/node_modules/got/dist/source/core/index.js:599:31)
    at Object.onceWrapper (node:events:628:26)
    at PassThrough.emit (node:events:525:35)
    at emitErrorNT (node:internal/streams/destroy:151:8)
    at emitErrorCloseNT (node:internal/streams/destroy:116:3)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21)
    at BrotliDecoder.zlibOnError [as onerror] (node:zlib:189:17)
```
### Expected behavior

Request succeeds and body is printed.
### Code to reproduce

```js
import { got } from "got"; // got 12.6.0

const resp = await got("https://www.yelp.com/search/snippet?find_desc=Hotels&request_origin=user&l=g%3A-2.2323167%2C53.476563899999995%2C-2.2373167%2C53.4715639");
console.log(resp.body);
```
### Checklist

- [ ] I have read the documentation. (I'll be honest, I didn't, but I'm strongly convinced this should work, given that it works for www.example.com.)
- [x] I have tried my code with the latest version of Node.js and Got.
Hm, after more investigation, it seems like the root of the issue is a corrupted (or otherwise malformed) brotli-compressed response. Both Firefox and Chromium on my machine handle it correctly in full, `curl --compressed` returns only part of the response, and got fails as reported above. Attached is a zip archive (because of GitHub's file-type limitations) containing the damaged brotli response. You can run a test server with, for example, Deno via `deno run -A server.js` (but basically any simple HTTP server that adds the headers shown below will do). `server.js` contents:
```js
const data = await Deno.readFile("issue.brotli");
console.log(data);

async function handleHttp(conn) {
  for await (const e of Deno.serveHttp(conn)) {
    e.respondWith(
      new Response(data, {
        headers: {
          // Serve the damaged payload as brotli-compressed JSON.
          "content-encoding": "br",
          "content-type": "application/json",
        },
      })
    );
  }
}

for await (const conn of Deno.listen({ port: 8000 })) {
  handleHttp(conn);
}
```
Attached: issue.zip
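If I read the stack trace right, the message comes straight from Node's zlib/brotli binding when the input ends mid-stream. A minimal sketch without got (the payload here is made up; any truncated brotli data should behave the same):

```js
import zlib from "node:zlib";

// Compress something, then cut the result short to mimic the damaged response.
const full = zlib.brotliCompressSync(Buffer.from("some longer payload ".repeat(100)));
const truncated = full.subarray(0, full.length - 10);

try {
  zlib.brotliDecompressSync(truncated);
} catch (err) {
  console.error(err.message); // "unexpected end of file"
}
```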
I'll leave it up to you whether this is something you want to fix/improve (I suppose it would be in the decompress-response package; is it possible to easily move this issue there?), or close as wontfix.
I got an email notification about an issue comment from @WillianAgostini, who noted that this works with axios. The comment has since been deleted, and I'm not sure why, because it really does work with axios.
For convenience, I host the broken file at ftp.mvolfik.com/broken.brotli.json. It serves the broken brotli payload with `content-encoding: br` regardless of the request headers.
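That means the repro no longer depends on Yelp (assuming the mirror stays up; I'm writing `https` here, adjust the scheme if needed):

```js
import { got } from "got";

// Same failure as with the Yelp URL, served from the hosted copy above.
const resp = await got("https://ftp.mvolfik.com/broken.brotli.json");
console.log(resp.body);
```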
Yes, I deleted it because I was looking into why it works with axios.
Here is the implementation: axios creates a stream to download all the data, and at the end all the data is concatenated and converted to JSON. https://github.com/axios/axios/blob/6f360a2531d8d70363fd9becef6a45a323f170e2/lib/adapters/http.js#L506-L558
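If I read the linked adapter correctly, part of why axios tolerates this is that it creates its decompression streams with a flush-style `finishFlush`, so a truncated stream ends quietly instead of erroring. Roughly (my reading of the linked code, not a verbatim copy):

```js
import zlib from "node:zlib";

// A decoder created like this does not raise "unexpected end of file" when
// the input stops early; it simply emits whatever it has decoded so far.
const decoder = zlib.createBrotliDecompress({
  finishFlush: zlib.constants.BROTLI_OPERATION_FLUSH,
});
```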
Based on the implementation in axios and the got stream docs (https://github.com/sindresorhus/got/blob/main/documentation/3-streams.md), here is a minimal example to work around this issue:
```js
import got from "got";

const url = "https://www.yelp.com/search/snippet?find_desc=Hotels&request_origin=user&l=g%3A-2.2323167%2C53.476563899999995%2C-2.2373167%2C53.4715639";

const stream = got.stream(url, {
  decompress: false,
});

const chunks = [];
stream.on("data", (chunk) => {
  chunks.push(chunk);
});
stream.once("end", () => {
  const buffer = Buffer.concat(chunks);
  console.log(JSON.parse(buffer.toString()));
});
```
Well, `decompress: false` just disables sending the compression (`accept-encoding`) headers, so you don't need to do anything fancy with streams:
```js
import got from "got";

const url = "https://www.yelp.com/search/snippet?find_desc=Hotels&request_origin=user&l=g%3A-2.2323167%2C53.476563899999995%2C-2.2373167%2C53.4715639";

console.log(await got(url, {
  decompress: false,
}));
```
I am aware of this workaround; I just wanted to point out the issue with the broken brotli stream. But as I said in my second comment, I'll leave it up to you whether this is something you want to fix, or close as wontfix because the site serves invalid data.
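For completeness: if someone wants the partial output that `curl --compressed` produces, the tolerant `finishFlush` idea from above can be combined with `got.stream`. A sketch under those assumptions (requesting brotli explicitly and decoding manually; only `decompress` and `headers` are got options here):

```js
import got from "got";
import zlib from "node:zlib";

const url = "https://www.yelp.com/search/snippet?find_desc=Hotels&request_origin=user&l=g%3A-2.2323167%2C53.476563899999995%2C-2.2373167%2C53.4715639";

// Ask for brotli ourselves, then decode with a flush-style finishFlush so a
// truncated stream yields its decodable prefix instead of throwing.
const raw = got.stream(url, {
  decompress: false,
  headers: { "accept-encoding": "br" },
});
const decoder = zlib.createBrotliDecompress({
  finishFlush: zlib.constants.BROTLI_OPERATION_FLUSH,
});

const chunks = [];
decoder.on("data", (chunk) => chunks.push(chunk));
decoder.on("end", () => console.log(Buffer.concat(chunks).toString()));
raw.pipe(decoder);
```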