Rate Limiting Fails with Multiple Distinct Keys and Order-Sensitive Limits
Laravel Version
11
PHP Version
8.1.4
Database Driver & Version
No response
Description
The current ThrottleRequests middleware incorrectly handles rate limits when:
- Multiple distinct keys (e.g., user_id+product_id) are used in the same limiter.
- The order of limits in the array affects behavior (e.g., a per-day limit checked before a per-minute one).
Steps To Reproduce
Issue 1: Over-Throttling with Multiple Distinct Keys
- Define a limiter:
```php
RateLimiter::for('update-product', function (Request $request) {
    return [
        Limit::perDay(1)->by('user:' . $request->user()->id),
        Limit::perDay(1)->by('product:' . $request->product->id),
    ];
});
```
- Scenario:
- User 1 updates Product 1 → 200 OK.
- User 2 updates Product 1 → 429 (Correct: product limit).
- User 2 tries to update Product 2 → 429 (Wrong: should be allowed!).
- Expected:
- Throttling should only apply to the violated key (product_id).
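To make the failure mode concrete, here is a minimal standalone sketch (plain PHP, no framework; the attempt() helper, key names, and in-memory array standing in for the cache counters are all illustrative) that mirrors the check-then-hit-per-limit loop the middleware currently uses:

```php
<?php

$hits = []; // in-memory stand-in for the cache-backed counters

// Mirrors the current loop: each limit is checked and immediately hit.
function attempt(array $limits, array &$hits): string
{
    foreach ($limits as [$key, $max]) {
        if (($hits[$key] ?? 0) >= $max) {
            return "429 (blocked on {$key})";
        }
        $hits[$key] = ($hits[$key] ?? 0) + 1; // hit() happens before later limits are checked
    }

    return '200 OK';
}

echo attempt([['user:1', 1], ['product:1', 1]], $hits), PHP_EOL; // 200 OK
echo attempt([['user:2', 1], ['product:1', 1]], $hits), PHP_EOL; // 429 on product:1, but user:2 was already hit
echo attempt([['user:2', 1], ['product:2', 1]], $hits), PHP_EOL; // 429 on user:2 -- the over-throttling
```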
Issue 2: Order-Sensitive Limits in Documentation
- Description: The documentation example orders the limits as perMinute then perDay, but reversing the order breaks the behavior.
- Broken Example:
```php
RateLimiter::for('uploads', function (Request $request) {
    return [
        Limit::perDay(1000)->by($request->user()->id),  // Long-term limit first
        Limit::perMinute(10)->by($request->user()->id), // Short-term limit second
    ];
});
```
- Scenario:
- Send 1000 requests in 1 minute.
- Expected: after one minute the per-minute window resets, so the user should be able to upload again.
- Actual: the user stays blocked for up to 24 hours, because every rejected request still incremented the daily counter.
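Reusing the attempt() helper from the sketch under Issue 1 (plain PHP, illustrative names), this is easy to demonstrate:

```php
// Continues the earlier sketch: fresh counters, [daily, minute] order.
$hits = [];

for ($i = 0; $i < 50; $i++) {
    attempt([['daily', 1000], ['minute', 10]], $hits);
}

echo $hits['minute'], PHP_EOL; // 10 -- only 10 requests got a 200
echo $hits['daily'], PHP_EOL;  // 50 -- yet all 50 consumed the daily quota
// Extrapolated to 1000 requests in one minute, the daily counter reaches
// 1000 and the user is locked out for the rest of the day.
```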
Proposed Solution
Modify src/Illuminate/Routing/Middleware/ThrottleRequests.php to:
- Check all limits first, then apply hit().
Code Fix:
```php
protected function handleRequest($request, Closure $next, array $limits)
{
    // 1. Check every limit first, without consuming any attempts
    foreach ($limits as $limit) {
        if ($this->limiter->tooManyAttempts($limit->key, $limit->maxAttempts)) {
            throw $this->buildException($request, $limit->key, $limit->maxAttempts, $limit->responseCallback);
        }
    }

    // 2. Hit every key only once all checks have passed
    foreach ($limits as $limit) {
        $this->limiter->hit($limit->key, $limit->decaySeconds);
    }

    return $next($request);
}
```
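Note: the shipped handleRequest() also appends the X-RateLimit-* headers to the response after calling $next($request); that header loop is unaffected by this proposal and is omitted from the sketch above for brevity.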
Suggested Labels: bug, rate limiting
To your first issue: you define a limit per day per user, so I think it is right that he is not able to update product 2. If you want to achieve one update per product and user, you should use one key like update:{userId}:{productId}.
For the second issue: I think it would be best to use distinctive keys like uploads:daily:{userId} and uploads:minute:{userId}. There are certain limitations to how the Laravel framework can determine what exactly you want to do, so I think distinctive keys are the best option.
Issue 1: Multiple Distinct Keys vs Composite Keys
Your Suggestion:
Using a composite key like update:{userId}:{productId} would solve the problem.
Why This Doesn’t Fully Address the Issue:
Business Requirement:
- We need two independent limits:
- User Limit: 1 update per user per day (regardless of the product).
- Product Limit: 1 update per product per day (regardless of the user).
- A composite key would only enforce "1 update per user for a specific product," which violates the requirement.
Your Suggestion Code:
```php
// Composite key approach
Limit::perDay(1)->by("user:{$user->id}:product:{$product->id}");
```
Result:
- User A can update Product 1 → Success.
- User A can still update Product 2 → Success (violates "1 update per user daily").
- User B can update Product 1 → Success (violates "1 update per product daily").
Conclusion:
- Composite keys work for combined limits but fail for independent per-user and per-product rules.
Issue 2: Order Sensitivity & Distinctive Keys
Your Suggestion:
- Use distinctive keys like uploads:daily:{userId} and uploads:minute:{userId}.
Why This Doesn’t Fix the Core Problem:
```php
RateLimiter::for('test', function (Request $request) {
    return [
        Limit::perDay(1000)->by("daily:{$request->user()->id}"),   // Checked (and hit) first
        Limit::perMinute(10)->by("minute:{$request->user()->id}"), // Checked second
    ];
});
```
If the limits are ordered as [daily, minute], the daily limit is checked and hit first on every request. After 1000 requests in a minute, the user is therefore throttled for 24 hours instead of 1 minute, even though only 10 of those requests actually succeeded.
Root Cause:
- hit() is applied to a key as soon as that key's check passes, so earlier keys are consumed even when a later limit in the array rejects the request.
- Limits are processed in array order, not by priority.
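For reference, the relevant loop in the shipped middleware interleaves the two steps per limit. Paraphrased (simplified, not verbatim) from ThrottleRequests::handleRequest() as of Laravel 11:

```php
protected function handleRequest($request, Closure $next, array $limits)
{
    foreach ($limits as $limit) {
        // Check this one limit...
        if ($this->limiter->tooManyAttempts($limit->key, $limit->maxAttempts)) {
            throw $this->buildException($request, $limit->key, $limit->maxAttempts, $limit->responseCallback);
        }

        // ...and hit its key immediately, before later limits are checked.
        $this->limiter->hit($limit->key, $limit->decaySeconds);
    }

    return $next($request);
}
```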
My solution:
```php
protected function handleRequest($request, Closure $next, array $limits)
{
    // 1. Check all limits first
    foreach ($limits as $limit) {
        if ($this->limiter->tooManyAttempts($limit->key, $limit->maxAttempts)) {
            throw $this->buildException($request, $limit->key, $limit->maxAttempts, $limit->responseCallback);
        }
    }

    // 2. Apply hits only if all limits pass
    foreach ($limits as $limit) {
        $this->limiter->hit($limit->key, $limit->decaySeconds);
    }

    return $next($request);
}
```
Advantages:
- Independent Limits: user and product limits work separately (no overlapping hit() calls). Revisiting the earlier scenario:
  - User B → Product 1: blocked (the product limit was already consumed by User A).
  - User B → Product 2: allowed (User B's own keys were never hit by the rejected request).
- Order Independence: short-term limits (e.g., per-minute) are enforced correctly even if defined after long-term limits.

Performance Trade-Off: yes, two loops add minimal overhead, but this is necessary to decouple limit checks from hits.
It took me a while to grasp the issue, but after thinking about it, I guess you are correct: the desired behavior should be to check all limits for too many attempts first and, if no exception is thrown, hit them afterwards.
I would send the PR to the master branch, as it breaks the current behavior in some sense, and some people might be relying on that behavior or using bespoke workarounds.
Be sure to explain thoroughly the issue in the PR, and try adding tests to increase its chances to be merged.
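For example, a feature test along these lines could cover the Issue 1 scenario (the route, models, and factories are placeholders; it assumes a PUT /products/{product} endpoint throttled by the 'update-product' limiter from the issue):

```php
public function test_throttling_only_applies_to_the_violated_key(): void
{
    [$alice, $bob] = User::factory()->count(2)->create();
    [$first, $second] = Product::factory()->count(2)->create();

    $this->actingAs($alice)->put("/products/{$first->id}")->assertOk();      // consumes product:{first}
    $this->actingAs($bob)->put("/products/{$first->id}")->assertStatus(429); // product limit violated
    $this->actingAs($bob)->put("/products/{$second->id}")->assertOk();       // must not be blocked by user:{bob}
}
```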
I agree that this is an issue. You may also want to sort the limit array by descending decay seconds, so that the response headers are served for the limit with the longest duration.
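Something like this at the top of handleRequest() would do it (a sketch, assuming the same $limits objects and decaySeconds property as in the snippets above):

```php
// Sort by decay window, longest first, per the suggestion above.
usort($limits, fn ($a, $b) => $b->decaySeconds <=> $a->decaySeconds);
```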