grpc-web
Response parsing breaks with 'grpc-encoding: gzip'
When the gRPC response is gzipped (indicated by the grpc-encoding: gzip response header), the grpc-web stream parser fails with this error (including the first few bytes for reference):
Uncaught Error: The stream is broken @0/0. Error: invalid frame byte. With input:
1,0,0,2,126,31,...
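For context, here is a sketch (my own, not grpc-web's actual parser) of how those bytes decode. gRPC frames each message with a 5-byte header: one flag byte plus a 4-byte big-endian payload length. Decoding the bytes from the error shows the "compressed" flag is set and the payload begins with the gzip magic byte, which is what trips the grpc-web parser:

```python
import struct

# 1 flag byte + 4-byte big-endian length (the standard gRPC message frame).
# Flag bit 0 means the payload is compressed per grpc-encoding.
def parse_frame_header(data: bytes):
    flags, length = struct.unpack(">BI", data[:5])
    return flags, length

# The first bytes reported in the error above.
raw = bytes([1, 0, 0, 2, 126, 31])
flags, length = parse_frame_header(raw)
# flags == 1: the compressed bit is set, which grpc-web rejects.
# length == 638: size of the gzipped payload.
# raw[5] == 0x1f: first byte of the gzip magic number (0x1f 0x8b).
```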
Unfortunately, the grpc-web filter in Envoy automatically adds the request header grpc-accept-encoding: identity,deflate,gzip when proxying to the gRPC server, which (at least with akka-grpc) causes the server to gzip the response. The Envoy source code has a comment stating that this header 'is required for gRPC'.
When I hacked akka-grpc to ignore the above header and not gzip, the response is properly parsed.
I'm a backend engineer, but if someone can give me some pointers I could take a stab at fixing this.
I have the same issue ... I had to hack the akka server to override grpc-accept-encoding so the server doesn't gzip the content.
According to this, it seems such compression is redundant. With Envoy you can use request_headers_to_remove (doc) or this to remove or modify this header entirely.
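For reference, a minimal sketch of what that looks like in an Envoy route configuration (the virtual host and cluster names here are illustrative; request_headers_to_remove is the relevant knob):

```yaml
route_config:
  virtual_hosts:
    - name: grpc_backend          # illustrative name
      domains: ["*"]
      routes:
        - match: { prefix: "/" }
          route: { cluster: grpc_service }   # illustrative cluster name
      # Strip the header the grpc-web filter adds, so the upstream
      # gRPC server never negotiates gzip.
      request_headers_to_remove:
        - "grpc-accept-encoding"
```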
@seroperson thanks for the pointers. Per this, the grpc-web client implements the grpc-web protocol spec described in that document, which says nothing about not using the grpc-accept-encoding and grpc-encoding headers. Is that an oversight in the spec that should be fixed? Otherwise, Envoy is following the spec and the client library isn't, and the client should be fixed.
This cost me some hours. Here is a solution, a bit of code to get others started who run into this issue with Akka HTTP:

```scala
val service = NetworkWorkersHandler(new NetworkWorkersService())

val handler: HttpRequest => Future[HttpResponse] = { request =>
  // Strip the header Envoy's grpc-web filter adds, so the server never
  // negotiates gzip. HttpHeader.is matches the name case-insensitively,
  // which is safer than comparing `name` with ==.
  val withoutEncoding =
    request.copy(headers = request.headers.filterNot(_.is("grpc-accept-encoding")))
  service(withoutEncoding)
}

val binding = Http().bindAndHandleAsync(
  handler,
  interface = "127.0.0.1",
  port = 8000,
  connectionContext = HttpConnectionContext(),
)
```
I am getting this exact same problem with a Python server when I use a bytes type in my proto. Is that internally being streamed, even though my proto is unary only? How do I fix it?
> How do I fix?

A workaround is to remove the grpc-accept-encoding header from the Akka-received HTTP request before passing it to the generated service. My post above shows an example of such a wrapper for a gRPC service.
But I am using a Python gRPC server with Envoy, not Akka.
Oh sorry, I remembered this as an akka-related issue and didn't see the repo in the notification. I guess the solution may be similar, the implementation different.
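For a Python gRPC server, the analogous fix (a sketch, assuming grpcio >= 1.23, where the grpc.Compression enum was added) is to disable compression when building the server, so it never honors the grpc-accept-encoding header Envoy injects:

```python
from concurrent import futures

import grpc

# Sketch: build the server with compression disabled so responses are
# never gzipped, regardless of what grpc-accept-encoding advertises.
server = grpc.server(
    futures.ThreadPoolExecutor(max_workers=4),
    compression=grpc.Compression.NoCompression,
)
# add_YourServicer_to_server(YourServicer(), server)  # hypothetical servicer
port = server.add_insecure_port("127.0.0.1:0")  # port 0: pick any free port
```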
My gRPC service server now sets context->set_compression_algorithm(GRPC_COMPRESS_GZIP); and I get the same exception: Uncaught Error: The stream is broken @0/0. How can I solve this? I cannot change the gRPC server, so how should I adjust Envoy?
@akan , @seroperson's comment above explains how.
@stanley-cheung I came here from https://github.com/grpc/grpc-web/issues/713 and was looking to enable the envoy.gzip HttpFilter (https://www.envoyproxy.io/docs/envoy/v1.15.0/configuration/http/http_filters/gzip_filter) on our Envoy service, but I'm worried about our grpc-web client not being able to handle compressed responses coming from Envoy.
What's the status of this issue and its resolution?
@stanley-cheung We are also facing a similar issue with gzip compression not being supported by grpc-web (issue #1000). The protobuf payload is larger than the gzipped JSON payload, and this adds to the response time. We are currently unable to move our JS client-to-service calls to grpc-web because of this.
Are there any plans you could share for adding gzip support in the near future? Or any suggestions for other compression that grpc-web supports, or ways to mitigate the issue?
@stanley-cheung Is there a proposed solution for compressing the response payloads? This is critical for our application running via grpc-web. Adding the envoy.gzip plugin doesn't seem to help according to: https://github.com/envoyproxy/envoy/issues/6632
Success:

```go
fi, err := os.Open(req.FileName)
if err != nil {
    log.Fatalf("ReadFile %s err: %v", req.FileName, err)
}
defer fi.Close()

for {
    // The buffer must be re-made on each loop iteration -- a pitfall,
    // since Send may still be holding the previous slice.
    buf := make([]byte, 4096)
    n, err := fi.Read(buf)
    if err != nil && err != io.EOF {
        log.Fatalf("ReadFile %s err: %v", req.FileName, err)
    }
    if n == 0 {
        break
    }
    err = stream.Send(&pb.DownLoadResp{
        FileChunk: buf[:n],
    })
    if err != nil {
        log.Fatalf("send %s err: %v", req.FileName, err)
    }
}
```
The "invalid frame byte" / "the stream is broken" errors mean the frame was cut at an incorrect size.
I've encountered the same issue, using a Python gRPC server and grpc-web as the client, with Envoy as the proxy. Is there any new progress on this? Any help would be greatly appreciated. @stanley-cheung
@ykn1106 https://github.com/grpc/grpc-dart/issues/668