
[fetch-proxy] Compressed responses retain original headers after decompression

Open phiggins opened this issue 5 months ago • 1 comments

Hi :wave:,

I tried to describe this issue in the previous repo but it was never addressed. The issue, as far as I understand it: if a request is proxied to a server that returns an encoded response, the proxy's response retains the original Content-Encoding and Content-Length headers even though the body has been decompressed. This problem is described better than I can describe it in this undici issue.

As an example, if I have this server running:

import http from 'node:http';
import { createFetchProxy } from '@remix-run/fetch-proxy';
import { createRequestListener } from '@remix-run/node-fetch-server';

const proxy = createFetchProxy('http://example.org/');

async function handler(request: Request) {
  console.log(`proxying ${request.url}`);
  const res = await proxy(request);
  // The headers still describe the compressed body, but fetch has already
  // decompressed the body itself, so the lengths don't match.
  console.log({
    'Content-Encoding': res.headers.get('Content-Encoding'),
    'Content-Length': res.headers.get('Content-Length'),
    'Body length': (await res.clone().text()).length,
  });
  return res;
}

const server = http.createServer(createRequestListener(handler));

server.listen(3000, () => {
  console.log('Server running at http://localhost:3000');
});

This is the output from curl:

phiggins@dust:~/projects/fetch-proxy-content-encoding-bug$ curl -v http://localhost:3000
* Host localhost:3000 was resolved.
* IPv6: ::1
* IPv4: 127.0.0.1
*   Trying [::1]:3000...
* Connected to localhost (::1) port 3000
> GET / HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/8.5.0
> Accept: */*
>
< HTTP/1.1 200 OK
< accept-ranges: bytes
< cache-control: max-age=86000
< connection: keep-alive
< content-encoding: gzip
< content-length: 648
< content-type: text/html
< date: Tue, 23 Sep 2025 21:28:26 GMT
< etag: "84238dfc8092e5d9c0dac8ef93371a07:1736799080.121134"
< last-modified: Mon, 13 Jan 2025 20:11:20 GMT
< vary: Accept-Encoding
<
<!doctype html>
<html>
<head>
    <title>Example Domain</title>

    <meta charset="utf-8" />
    <meta http-equiv="Content-type" content="text/html; charset=utf-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1" />
    <style type="text/css">
    body {
        background-color: #f0f0f2;
        margin: 0;
        padding: 0;
        font-family: -apple-system, system-ui, BlinkMacSystemFont, "Segoe UI", "Open Sans", "Helvetica Neue", Helvetica, Arial, sans-serif;

    }
    div {
        width: 600px;
        margin: 5em auto;
        padding: 2em;
        background-color: #fdfdff;
* Excess found writing body: excess = 608, size = 648, maxdownload = 648, bytecount = 648
* Closing connection
        border-radius:phiggins@dust:~/projects/fetch-proxy-content-encoding-bug$

This is the server's console output:

phiggins@dust:~/projects/fetch-proxy-content-encoding-bug$ node --disable-warning=ExperimentalWarning --experimental-strip-types test.ts
Server running at http://localhost:3000
proxying http://localhost:3000/
{
  'Content-Encoding': 'gzip',
  'Content-Length': '648',
  'Body length': 1256
}

If I access the server in a browser I see this:

[screenshot attached to the original issue]

In the react-router 7 app where I originally encountered this error, deleting the Content-Encoding and Content-Length headers from the proxied response fixed the issue, but there is potentially a better way to fix this.
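For reference, the header-stripping workaround looks roughly like this. The `stripStaleEncodingHeaders` helper is a hypothetical name for illustration, not part of fetch-proxy:

```typescript
// Drop the headers that describe the *compressed* body; fetch has already
// decompressed the body, so these headers are stale and confuse clients.
function stripStaleEncodingHeaders(res: Response): Response {
  const headers = new Headers(res.headers);
  headers.delete('Content-Encoding');
  headers.delete('Content-Length');
  return new Response(res.body, {
    status: res.status,
    statusText: res.statusText,
    headers,
  });
}
```

In a handler this would wrap the proxied response, e.g. `return stripStaleEncodingHeaders(await proxy(request))`.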

phiggins avatar Sep 23 '25 21:09 phiggins

I've run into this exact issue before. I don't know what a good solution is:

  • Have the proxy re-compress the response, so that the headers are correct?
  • Remove them entirely?
  • Add an option to fetch to not decompress the response (but we really shouldn't be patching globals)

Would be great to get the team's point of view
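For what it's worth, the re-compress option could be sketched with the web CompressionStream API (global in Node 18+). `recompressGzip` is a hypothetical helper, and a real proxy would also need to respect the client's Accept-Encoding:

```typescript
// Re-compress the already-decompressed body so Content-Encoding is truthful
// again. The original Content-Length no longer applies and the new length
// isn't known up front, so drop it and let transfer encoding handle framing.
function recompressGzip(res: Response): Response {
  const headers = new Headers(res.headers);
  headers.delete('Content-Length');
  headers.set('Content-Encoding', 'gzip');
  const body = res.body
    ? res.body.pipeThrough(new CompressionStream('gzip'))
    : null;
  return new Response(body, {
    status: res.status,
    statusText: res.statusText,
    headers,
  });
}
```

The downside is that the proxy pays for a decompress/recompress round trip on every response, which is why just stripping the headers may be the more pragmatic default.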

itxch avatar Oct 11 '25 01:10 itxch