graphql-engine
Large Integers in JSONB (postgres) rounding
Version Information
v2.37.0
OSS
What is the current behaviour?
JSONB columns in Postgres with big integer values are being rounded on insert or select, e.g. 750727362669608961 becomes 750727362669609000.
What is the expected behaviour?
Hasura should marshal the JSONB field back untouched.
How to reproduce the issue?
Note: I was troubleshooting over https://discord.com/channels/407792526867693568/1204713155125710929. The user who reported this initially is doing it differently and will report how they are doing it once I post this.
- Add a JSONB column to a Postgres database
- Insert a very large integer key/value pair via the console Data tab, e.g. {"creatorID": 750727362669608961}
- Select it back and get 750727362669609000
Screenshots or Screencast
Schema from console (screenshot)
Post-insert via console → Data → Browse (screenshot)
GraphQL query to pull the same record:
query MyQuery {
  jo_test(where: {id: {_eq: 1}}) {
    id
    events
  }
}
Results of the GraphQL query:
{
  "data": {
    "jo_test": [
      {
        "id": 1,
        "events": {
          "creatorID": 750727362669609000
        }
      }
    ]
  }
}
Now, an interesting thing: when I do this via a mutation, everything works as intended (note that the value appears to round-trip as a JSON string here, so it is never parsed as a number)...
Mutation and response:
mutation MyMutation {
  insert_jo_test_one(object: {events: "{\"creatorID\": 750727362669608961}"}) {
    id
    events
  }
}
{
  "data": {
    "insert_jo_test_one": {
      "id": 2,
      "events": "{\"creatorID\": 750727362669608961}"
    }
  }
}
Select and response for record 2:
query MyQuery {
  jo_test(where: {id: {_eq: 2}}) {
    id
    events
  }
}
{
  "data": {
    "jo_test": [
      {
        "id": 2,
        "events": "{\"creatorID\": 750727362669608961}"
      }
    ]
  }
}
Keywords
jsonb postgresql rounding truncation json parser
After discussing with Tristen, we think it's related to the JSON parser.
Hey, I am the OP here, thanks for filing.
I used golang/pgx to insert my data. In Go, this field was a uint64.
Hey folks, thanks for reporting. The underlying issue here is that most JSON implementations represent numbers as IEEE 754 double-precision floats, so big integers beyond the safely representable range (2^53 - 1) can be truncated at various points, such as when browsers parse JSON:
Screencast from 2024-02-14 16-54-16.webm
Specifically, what's going on here is that the browser encodes and decodes these big integer values outside of graphql-engine.
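To make that concrete, here's a minimal sketch (runnable in Node or any browser console; the value is the one from this report) showing how a plain JSON.parse round-trip loses the low digits:

```typescript
// JavaScript/TypeScript numbers are IEEE 754 doubles: integers are only
// exact up to Number.MAX_SAFE_INTEGER = 2^53 - 1.
const raw = '{"creatorID": 750727362669608961}';

const parsed = JSON.parse(raw);

console.log(Number.MAX_SAFE_INTEGER);                // 9007199254740991
console.log(Number.isSafeInteger(parsed.creatorID)); // false
console.log(parsed.creatorID);                       // 750727362669609000 (low digits lost)
console.log(JSON.stringify(parsed));                 // {"creatorID":750727362669609000}
```

That is why the console's Data tab, which parses and re-serialises the JSON in the browser, mangles the value, while paths that keep the value inside a string, like the mutation above, do not.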
So what are the options for solving this? Would it be possible to have an option in the engine to transform JSONB so that numbers are converted to strings?
Would it be possible to have this fixed in Hasura Cloud's GraphiQL?
I got such mixed results with my experiments. Manual insertion via DBeaver worked fine, and insertion via a mutation worked fine; it was only when I used the Data tab to insert the row that this happened. @bjartek is using a data structure that should be able to handle that many digits, but something, somewhere is getting lost in the sauce.
While it's theoretically possible to modify the Hasura console so that large numbers are propagated unchanged, doing so would not eliminate the surprising behaviour, because any producer or consumer of the JSON would still be entitled to truncate numeric values.
RFC 8259's recommendation for good interoperability is to expect no more precision or range than IEEE 754 double precision.
For wide numeric types at the database column level, Hasura provides the stringify-numeric-types option. Of course, this option does not apply within JSONB columns, because Hasura has no way to infer a schema from arbitrary JSON.
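For reference, enabling that option looks like the following; this is the documented server setting, though it's worth verifying the exact spelling against the docs for your version:

```sh
# Stringifies wide numeric column types (e.g. bigint, numeric) in responses.
# Note: this applies to typed columns only, not to values inside JSONB.
HASURA_GRAPHQL_STRINGIFY_NUMERIC_TYPES=true
# or, equivalently, the --stringify-numeric-types server flag
```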
We would recommend changing the format of your JSON to use a string representation of the large number. This ensures that your numeric data stays portable and is respected by all potential producers and consumers.
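As an illustration, here is a minimal sketch (assuming a TypeScript/JavaScript consumer) of that approach: keep the ID as a JSON string end-to-end and convert it to a BigInt only where arithmetic is actually needed:

```typescript
// The ID is stored as a JSON string, so no JSON parser ever sees it as a number.
const raw = '{"creatorID": "750727362669608961"}';

const parsed = JSON.parse(raw);              // the string survives parsing untouched
const id: bigint = BigInt(parsed.creatorID); // exact, arbitrary-precision integer

console.log(id);                             // 750727362669608961n
console.log(JSON.stringify(parsed));         // {"creatorID":"750727362669608961"}
```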