Throttling Service/Rate Limit
Feature Request
What problem are you trying to solve?
It's not uncommon for services to need protection by a throttling mechanism that could be defined by the following settings:
- Max number of open connections (from all clients)
- Max number of open connections per client (service name)
- Max number of requests per time period (from all clients)
- Max number of requests per time period per client (service name)
- Response code/message to override the default value (code: 429, message: "the quota X was exceeded")
Client Types:
- Service (identified by ServiceProfiles)
- User (identified by JWT/header)
How should the problem be solved?
The best option is to integrate this feature with server-side policy: once a rate limit policy is defined, the proxy enforces its rules.
Any alternatives you've considered?
Integration with an external rate limit service, similar to Envoy's implementation.
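For reference, Envoy delegates the limit decision to an external gRPC service; its HTTP filter config looks roughly like the sketch below (written from memory, so details may be off):

http_filters:
- name: envoy.filters.http.ratelimit
  typed_config:
    "@type": type.googleapis.com/envoy.extensions.filters.http.ratelimit.v3.RateLimit
    domain: my-service                      # lookup domain for the external service's rule set
    rate_limit_service:
      transport_api_version: V3
      grpc_service:
        envoy_grpc:
          cluster_name: rate_limit_cluster  # cluster pointing at the external rate limit service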
How would users interact with this feature?
apiVersion: v1
kind: ServerSidePolicyRateLimit
metadata:
  name: my-service
  namespace: prod
spec:
  rules:
  - name: global-threshold              # cap on open connections across all clients
    type: global-open-connections
    value: 5000
  - name: requests-per-minute-service   # per-client request limit, keyed by service name
    type: requests-limit
    period: 60s
    value: 1000
    clientType: service                 # which kind of client the limit applies to
    selector: other-service
  - name: requests-per-minute-user      # per-client request limit, keyed by a JWT claim
    type: requests-limit
    period: 60s
    value: 100
    clientType: user
    selector: JWT::appid
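One setting from the list above that this example doesn't cover is the response override; it could plausibly be expressed as an extra per-rule field, e.g. (the onLimitExceeded block is purely illustrative, not a proposed schema):

  - name: requests-per-minute-user
    type: requests-limit
    period: 60s
    value: 100
    clientType: user
    selector: JWT::appid
    onLimitExceeded:                    # illustrative: override the default throttled response
      code: 429
      message: "the quota requests-per-minute-user was exceeded"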
This has been labelled as design since June 2021; is there any work in progress for this awesome feature? 🤟🏽
While work on this specific feature hasn't moved forward, we do have some work in flight that will be foundational to implementing a feature like this. The new policy CRDs use the policy attachment pattern to configure server-side policy, and rate limiting resources will fit this same pattern, binding onto Server or route resources. We're currently focused on shipping updated access policies that can be attached to an HTTP route instead of just a Server, but I'd expect other types of policy (e.g., RequestRateLimitPolicy) to follow AuthorizationPolicy.
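To sketch what that attachment might look like: AuthorizationPolicy binds to its target via a targetRef, and a rate limit policy following the same pattern could read something like this (the RequestRateLimitPolicy kind and its spec fields are illustrative, not a shipped API):

apiVersion: policy.linkerd.io/v1alpha1
kind: RequestRateLimitPolicy      # hypothetical kind following the AuthorizationPolicy pattern
metadata:
  name: my-service-rate-limit
  namespace: prod
spec:
  targetRef:                      # policy attachment: bind onto a Server (or route) resource
    group: policy.linkerd.io
    kind: Server
    name: my-service
  requestsPerMinute: 1000         # illustrative limit field; the real schema is undefined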
Any progress here? It's 2024 now.
Lots of moving parts over the last couple of years, but we're hoping to be able to address this before it stops being 2024. 🤞