graphql-php
Rate-Limit: Possibility to retrieve the QueryComplexity
Hello everyone,
We have a requirement to build a rate-limit system based on query complexity rather than on request hits. This blog post describes the approach: https://shopify.engineering/rate-limiting-graphql-apis-calculating-query-complexity To make this happen, we need to calculate the query complexity before we execute the query.
This can be done by executing the QueryComplexity rule before the "executeQuery" method. In the past, there was also a getter on the validator to retrieve the calculated query complexity.
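A rough sketch of what that could look like, assuming the getter from PR #531 existed again (the getter name `getQueryComplexity()`, the `$schema`/`$maxQueryComplexity` variables, and the `$rateLimiter` object are all placeholders here, not current library API):

```php
use GraphQL\GraphQL;
use GraphQL\Language\Parser;
use GraphQL\Validator\DocumentValidator;
use GraphQL\Validator\Rules\QueryComplexity;

$rule = new QueryComplexity($maxQueryComplexity);
$document = Parser::parse($query);

// Run only this rule up front, before executing the query
$errors = DocumentValidator::validate($schema, $document, [$rule]);

// Hypothetical getter from PR #531 - NOT part of the current API
$cost = $rule->getQueryComplexity();

// $rateLimiter stands in for whatever token-bucket implementation you use
if (! $rateLimiter->consume($clientId, $cost)) {
    throw new \RuntimeException('Rate limit exceeded'); // e.g. respond with HTTP 429
}

$result = GraphQL::executeQuery($schema, $document);
```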
Bonus: the query complexity actually used after execution can be lower than the requested complexity. For example, you request a list of 10 items but the system only has 5. In a perfect world we would refund the 5 over-requested nodes. While such a sub-feature would be nice, it is certainly not a requirement for a first change.
Introduction of the QueryComplexity getter: https://github.com/webonyx/graphql-php/pull/531
This issue was created based on the following discussion: https://github.com/webonyx/graphql-php/discussions/1025
Thanks!
I am not sure what this issue is asking. What needs to change in the library? Maybe you can just create a pull request.
The simplest option would be to bring back the functionality from https://github.com/webonyx/graphql-php/pull/531, i.e. a getter or a public property on the QueryComplexity class to retrieve the calculated complexity.
> Maybe you can just create a pull request.
I can try this.
@kinimodmeyer Out of curiosity: how did you resolve the issue of multiple operations? I mean, if you send more than one, you have to provide operationName as well – but the QueryComplexity validator does not account for this, and runs on all of them; then the getter will return the complexity of the last operation, not the one that was actually executed.
I am asking because my team had the same issue, and we are also implementing complexity-based rate limiting. We decided to build our own query complexity validator, with the possibility to restrict validation to just one operation name.
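For reference, a very rough sketch of such a restriction, assuming graphql-php's `ValidationRule` base class and its `getVisitor()` hook (class and method signatures vary between library versions, and the cost model below is deliberately naive, not the one from `QueryComplexity`):

```php
use GraphQL\Language\AST\NodeKind;
use GraphQL\Language\AST\OperationDefinitionNode;
use GraphQL\Language\Visitor;
use GraphQL\Validator\Rules\ValidationRule;
use GraphQL\Validator\ValidationContext;

// Hypothetical rule: only measures the operation matching $operationName.
class SingleOperationComplexity extends ValidationRule
{
    private int $complexity = 0;

    public function __construct(private ?string $operationName)
    {
    }

    public function getVisitor(ValidationContext $context): array
    {
        return [
            NodeKind::OPERATION_DEFINITION => [
                'enter' => function (OperationDefinitionNode $op) {
                    if ($this->operationName !== null
                        && $op->name?->value !== $this->operationName
                    ) {
                        return Visitor::skipNode(); // ignore other operations
                    }
                },
            ],
            // Naive cost model: every visited field costs 1. A real
            // implementation would mirror QueryComplexity's handling of
            // fragments, arguments, custom field costs, etc.
            NodeKind::FIELD => function () {
                $this->complexity++;
            },
        ];
    }

    public function getComplexity(): int
    {
        return $this->complexity;
    }
}
```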
@jacor84 you could remove the unneeded operations from the parsed document (also have a look at the fix for #785):
```php
use GraphQL\Language\AST\NodeKind;
use GraphQL\Language\Parser;
use GraphQL\Validator\DocumentValidator;
use GraphQL\Validator\Rules\QueryComplexity;

$document = Parser::parse($input['query']);
$newDefinitions = [];
foreach ($document->definitions as $def) {
    // drop every operation except the one selected via operationName
    if (isset($input['operationName'])
        && $def->kind === NodeKind::OPERATION_DEFINITION
        && $def->name->value !== $input['operationName']
    ) {
        continue;
    }
    // fix for https://github.com/webonyx/graphql-php/issues/785:
    // fragment definitions are moved before the operations that use them
    if ($def->kind === NodeKind::FRAGMENT_DEFINITION) {
        array_unshift($newDefinitions, $def);
    } else {
        $newDefinitions[] = $def;
    }
}
$document->definitions = $newDefinitions;

$rule = new QueryComplexity($maxQueryComplexity);
$parseErrors = DocumentValidator::validate(
    $schema, // the schema (elided as "..." in the original snippet)
    $document,
    array_merge(DocumentValidator::defaultRules(), [$rule])
);
```
Thanks, @kinimodmeyer! Indeed, this would be one way of achieving this, and we considered such an approach, but:
- It wouldn't work with the `StandardServer`, which, I assume, is the default way of using this library.
- It could fail other validators, like `NoUnusedFragments` or `NoUnusedVariables`, which are useful and included in `DocumentValidator::defaultRules()`.
I will close the issue now. I also believe that it is better to create your own custom validator to handle the rate limit instead of modifying the existing QueryComplexity validator.
Maybe in the future the complexity-calculation logic inside the QueryComplexity validator can be further abstracted, to make it easier to reuse in your own validators.