
Proxy-cache plugin didn't work on gRPC route


There is a Greeter service implemented in Python gRPC. I configured a service and route for it on the Kong gateway, and also added the proxy-cache plugin at the route level.
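Roughly, the setup was done along these lines via the Admin API (the upstream address, names, and the proxy-cache settings below are illustrative assumptions, not the exact values used):

```bash
# Illustrative only: upstream address, names, and plugin settings are assumed.
# gRPC service pointing at the Python Greeter server (plaintext HTTP/2 upstream).
curl -s -X POST http://localhost:8001/services \
  --data name=greeter \
  --data protocol=grpc \
  --data host=localhost \
  --data port=50051

# gRPC route for that service.
curl -s -X POST http://localhost:8001/services/greeter/routes \
  --data name=greeter-route \
  --data protocols=grpc \
  --data paths=/greeter.Greeter

# proxy-cache enabled at the route level; gRPC calls are POSTs with
# content type application/grpc, so both are listed explicitly here.
curl -s -X POST http://localhost:8001/routes/greeter-route/plugins \
  --data name=proxy-cache \
  --data config.strategy=memory \
  --data config.request_method=POST \
  --data config.content_type=application/grpc
```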

  1. First request: grpcurl -vv -d '{"name":"Kong"}' -plaintext :8443 greeter.Greeter/SayHello

Resolved method descriptor: rpc SayHello ( .greeter.HelloRequest ) returns ( .greeter.HelloReply );

Request metadata to send: (empty)

Response headers received:
accept-encoding: identity,gzip
content-type: application/grpc
date: Thu, 16 Jun 2022 10:02:45 GMT
grpc-accept-encoding: identity,deflate,gzip
server: openresty
via: kong/2.8.1.1-enterprise-edition
x-cache-key: a3c46e9006cafe2f534510af3620c347
x-cache-status: Miss
x-kong-proxy-latency: 1
x-kong-upstream-latency: 2

Estimated response size: 14 bytes

Response contents: { "message": "Hello, Kong!" }

Response trailers received: (empty)
Sent 1 request and received 1 response

log from server side:
- - [16/Jun/2022:10:02:45 +0000] "POST /greeter.Greeter/SayHello HTTP/2.0" 200 19 "-" "grpcurl/1.8.5 grpc-go/1.37.0"
- - [16/Jun/2022:10:02:45 +0000] "POST /grpc.reflection.v1alpha.ServerReflection/ServerReflectionInfo HTTP/2.0" 200 1046 "-" "grpcurl/1.8.5 grpc-go/1.37.0"

  2. Second request: grpcurl -vv -d '{"name":"Kong"}' -plaintext :8443 greeter.Greeter/SayHello

Resolved method descriptor: rpc SayHello ( .greeter.HelloRequest ) returns ( .greeter.HelloReply );

Request metadata to send: (empty)

Response headers received:
accept-encoding: identity,gzip
age: 14
content-length: 19
content-type: application/grpc
date: Thu, 16 Jun 2022 10:02:47 GMT
grpc-accept-encoding: identity,deflate,gzip
server: openresty
via: kong/2.8.1.1-enterprise-edition
x-cache-key: a3c46e9006cafe2f534510af3620c347
x-cache-status: Hit
x-kong-proxy-latency: 1
x-kong-upstream-latency: 0

Response trailers received: (empty)
Sent 1 request and received 0 responses
ERROR:
  Code: Internal
  Message: server closed the stream without sending trailers

log from server side:
- - [16/Jun/2022:10:02:47 +0000] "POST /greeter.Greeter/SayHello HTTP/2.0" 200 19 "-" "grpcurl/1.8.5 grpc-go/1.37.0"
- - [16/Jun/2022:10:02:47 +0000] "POST /grpc.reflection.v1alpha.ServerReflection/ServerReflectionInfo HTTP/2.0" 200 1046 "-" "grpcurl/1.8.5 grpc-go/1.37.0"

From the response and the server log, the cache was a Hit, the HTTP status was 200, and the content-length was 19, yet no body content was returned. I'm not sure what happened.

Does the proxy-cache plugin support gRPC services?

fanny2017 · Jun 16 '22 10:06

Any update on this? We also stumbled on the same issue: we tried using proxy-cache for a gRPC service, with the POST method.

The result is a HIT:

[Screenshot, 2022-10-21 14:57:48]

But we didn't get the response body; we got the following response instead:

curl: (92) HTTP/2 stream 0 was not closed cleanly:
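For reference, the call looked roughly like the sketch below (the listener port, route path, and the request.bin payload, a length-prefixed serialized HelloRequest, are assumptions for illustration). The response headers come back with x-cache-status: Hit, but the stream then aborts with the error above:

```bash
# Illustrative reproduction: port, path, and payload file are assumed.
# request.bin would hold the length-prefixed, serialized HelloRequest message.
curl -v --http2-prior-knowledge \
  -X POST \
  -H 'Content-Type: application/grpc' \
  -H 'TE: trailers' \
  --data-binary @request.bin \
  http://localhost:8000/greeter.Greeter/SayHello
```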

hendras8 · Oct 21 '22 08:10

Internal tracking ID (for maintainers' reference): KAG-846.

gszr · Mar 10 '23 17:03

This is because HTTP/2 requests do not fit the proxy-cache plugin's scenario. For an HTTP/2 stream (such as gRPC), enabling the proxy-cache plugin may block the entire processing of the HTTP/2 request, because Kong/OpenResty cannot read the whole request body in any phase of the HTTP/2 connection. This causes Nginx to block the Lua coroutine and prevents it from moving on to the next phase.

https://github.com/openresty/lua-nginx-module/issues/2172

oowl · Mar 28 '23 07:03

gRPC is a stream-based protocol, and at first I did not understand this scenario. But after a thorough investigation of the Nginx and OpenResty code, this case may not be handled by Nginx's existing model, because the gRPC HTTP/2 DATA frame is not the end of the stream.

oowl · Mar 28 '23 08:03

Dear contributor,

We're closing this issue as there hasn't been any update to it for a long time. If the issue is still relevant in the latest version, please feel free to reopen it. We're more than happy to revisit it again. Your contribution is greatly appreciated! Please have a look at our pledge to the community for more information.

Sincerely,
Kong Gateway Team

StarlightIbuki · Oct 11 '23 06:10