gzip
Gzip will compress the compressed body again
Gzip will compress the compressed body again when I use third-party handlers to handle the gin context, such as httputil.ReverseProxy or promhttp.
It seems that if the target HTTP server returns a gzip-compressed body, the middleware compresses it again, so the client cannot decompress the response correctly. For example, due to the double compression, Prometheus can't scrape the target correctly, or some responses come back garbled.
Though I can work around this by setting gzip.WithExcludedPaths
or the other WithExcluded* options, I don't think that is the best way: sometimes we want to solve this problem more wisely and elegantly, rather than hard-coding the excluded paths.
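For reference, the workaround mentioned above looks roughly like this: a sketch using gin-contrib/gzip's WithExcludedPaths option, where /metrics and /proxy are hypothetical examples of routes whose handlers (e.g. promhttp or a reverse proxy) may return an already-compressed body:

```go
package main

import (
	"github.com/gin-contrib/gzip"
	"github.com/gin-gonic/gin"
)

func main() {
	r := gin.Default()
	// Exclude routes whose handlers may return an already-compressed
	// body, so the middleware never compresses them a second time.
	// Drawback: every such path must be hard-coded here.
	r.Use(gzip.Gzip(gzip.DefaultCompression,
		gzip.WithExcludedPaths([]string{"/metrics", "/proxy"})))
	r.Run(":8080")
}
```

The drawback is exactly what's described above: the paths have to be known and listed in advance.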
I also tried to fix this problem myself, and my fix passes all the default unit tests.
Would a PR be welcome?
I had the same problem. Can we do something like this?
func (g *gzipHandler) shouldCompress(req *http.Request) bool {
	if !strings.Contains(req.Header.Get("Accept-Encoding"), "gzip") ||
		strings.Contains(req.Header.Get("Connection"), "Upgrade") ||
		strings.Contains(req.Header.Get("Accept"), "text/event-stream") {
		return false
	}

	extension := filepath.Ext(req.URL.Path)
	if g.ExcludedExtensions.Contains(extension) {
		return false
	}
	if g.ExcludedPaths.Contains(req.URL.Path) {
		return false
	}
	if g.ExcludedPathesRegexs.Contains(req.URL.Path) {
		return false
	}

	// Remove the header so downstream handlers don't see gzip support
	// and therefore won't compress the body themselves.
	req.Header.Del("Accept-Encoding")
	return true
}
@SinuxLee I don't think that's a good idea. We should skip compression when the body is already gzip-compressed, rather than change the request header. In my opinion, middleware should leave the HTTP message intact for other middlewares and handlers.
@LawyZheng 👍 Looking forward to your PR. Maybe I can help review the code :)
Any update on this issue?
waiting for PR merge