server: `limit` size restrictions
```graphql
query MyQuery {
  test(limit: 2147483648) {
    id
  }
}
```
yields:
```json
{
  "errors": [
    {
      "extensions": {
        "path": "$.selectionSet.test.args.limit",
        "code": "parse-failed"
      },
      "message": "The value 2.147483648e9 lies outside the bounds or is not an integer. Maybe it is a float, or is there integer overflow?"
    }
  ]
}
```
Note that the `id` field here is of type `bigint`, although the `limit` argument is of type `Int`. So it seems that graphql-engine does not support large LIMITs. In fact, I was a little worried that the function `pgColValueToInt` in `server/src-lib/Hasura/SQL/Value.hs` would cast arbitrary integers to a Haskell `Int` and thus end up with incorrect LIMITs. However, this does not seem to be the case.
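For illustration, here is a minimal sketch of the silent overflow that paragraph worries about. This is hypothetical code, not what `pgColValueToInt` actually does: naively narrowing an arbitrary-precision `Integer` to a fixed-width type wraps around instead of failing, which would silently produce a wrong LIMIT.

```haskell
import Data.Int (Int32)

-- Hypothetical narrowing cast, for illustration only.
naiveCast :: Integer -> Int32
naiveCast = fromIntegral

main :: IO ()
main = do
  print (naiveCast 2147483647) -- 2147483647: fits
  print (naiveCast 2147483648) -- -2147483648: silent wraparound
```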
In any case, we should allow bigger limits, because Postgres has no such size restriction. It probably suffices to store limits as `Integer` internally. Similar restrictions may currently apply to OFFSET as well.
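A rough sketch of what that could look like, assuming the `scientific` package (which aeson uses to represent JSON numbers; the `parseLimit*` names here are made up for this example). A bounded parse reproduces the current rejection, while parsing to an unbounded `Integer` rejects only genuine floats:

```haskell
{-# LANGUAGE ScopedTypeVariables #-}
import Data.Int (Int32)
import Data.Scientific (Scientific, floatingOrInteger, toBoundedInteger)

-- Roughly the current behaviour: anything outside Int32 is rejected,
-- which is where the "lies outside the bounds" error comes from.
parseLimitBounded :: Scientific -> Maybe Int32
parseLimitBounded = toBoundedInteger

-- Sketch of the proposal: keep the limit as an unbounded Integer and
-- reject only genuine floats, since Postgres accepts LIMITs beyond Int32.
parseLimitInteger :: Scientific -> Maybe Integer
parseLimitInteger s = case floatingOrInteger s of
  Left (_ :: Double) -> Nothing -- a real float such as 1.5
  Right i            -> Just i

main :: IO ()
main = do
  print (parseLimitBounded 2147483648) -- Nothing (today's error)
  print (parseLimitInteger 2147483648) -- Just 2147483648
```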
Can we get clarification on the bounds for integer values? Larger integers that are valid in both JavaScript and Postgres result in the error below; Hasura rejects them as out of bounds and prints the value in scientific notation.
I'm currently using text fields as a workaround, but that's not ideal.
```json
{
  "errors": [
    {
      "extensions": {
        "path": "$.variableValues.size",
        "code": "parse-failed"
      },
      "message": "The value 2.592e9 lies outside the bounds or is not an integer. Maybe it is a float, or is there integer overflow?"
    }
  ]
}
```
@hthillman Does the table here help? https://hasura.io/docs/1.0/graphql/core/api-reference/postgresql-types.html#introduction
@tirumaraiselvan
I interpret that as "Hasura `Int` is actually an alias for Postgres `int4`". For fields that could potentially overflow an `int4`, we should explicitly use Hasura `bigint` / Postgres `int8`; is that accurate?
@hthillman That's right! Although "Hasura `Int`" is the standard GraphQL scalar `Int`: https://spec.graphql.org/June2018/#sec-Int
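To make those bounds concrete: the `Int32` range below is exactly the GraphQL spec's `Int` range, and the two values checked are the ones from the errors above.

```haskell
import Data.Int (Int32)

main :: IO ()
main = do
  print (minBound :: Int32, maxBound :: Int32)       -- (-2147483648,2147483647)
  print (2147483648 > toInteger (maxBound :: Int32)) -- True: the limit example
  print (2592000000 > toInteger (maxBound :: Int32)) -- True: the 2.592e9 variable
```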
The current issue is about supporting `bigint` as the type of `limit` (but that seems like a breaking change, unfortunately).
Just changing the datatype from `int` to `bigint` solved the same problem for me.