error: operator does not exist: jsonb @> json
Raw query with @> works just fine:
> await db.query(`select * from test where data @> '{"hello": "world"}'`);
{
rows: [ { data: { hello: "world" } } ],
fields: [ { name: "data", dataTypeID: 3802 } ],
affectedRows: 0
}
The same query with parameters fails, because the parameter is converted to the json type instead of jsonb:
> await db.query(`select * from test where data @> $1`, [{"hello": "world"}]);
Uncaught error: operator does not exist: jsonb @> json
at V.parseErrorMessage (file:///home/alex/.cache/deno/npm/registry.npmjs.org/@electric-sql/pglite/0.2.0/dist/index.js:1:48544)
at V.handlePacket (file:///home/alex/.cache/deno/npm/registry.npmjs.org/@electric-sql/pglite/0.2.0/dist/index.js:1:45456)
at V.parse (file:///home/alex/.cache/deno/npm/registry.npmjs.org/@electric-sql/pglite/0.2.0/dist/index.js:1:44194)
at Mt.execProtocol (file:///home/alex/.cache/deno/npm/registry.npmjs.org/@electric-sql/pglite/0.2.0/dist/index.js:1:51142)
at eventLoopTick (ext:core/01_core.js:168:7)
at async Mt.$ (file:///home/alex/.cache/deno/npm/registry.npmjs.org/@electric-sql/pglite/0.2.0/dist/index.js:1:56205)
at async file:///home/alex/.cache/deno/npm/registry.npmjs.org/@electric-sql/pglite/0.2.0/dist/index.js:1:55069
Here is the setup:
await db.query('create table test (data jsonb)');
await db.query(`insert into test (data) values ('{"hello": "world"}')`);
Offending line here: https://github.com/electric-sql/pglite/blob/e522c118b2e304879c99927d07b0e83337db32e5/packages/pglite/src/types.ts#L122
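For context, the gist of that line (a rough sketch of the kind of type mapping involved, not the actual pglite source) is that plain JS objects get stringified but tagged with the json type OID (114) rather than jsonb (3802), so the server ends up trying to resolve jsonb @> json:

// Hypothetical sketch of the parameter serialization at issue, not the real
// pglite code. OIDs: 114 = json, 3802 = jsonb, 0 = unknown (server infers).
function serializeParam(value: unknown): { data: string; oid: number } {
  if (typeof value === 'string' || typeof value === 'number') {
    return { data: String(value), oid: 0 }
  }
  if (value !== null && typeof value === 'object') {
    // Objects are stringified but bound as json (114), never jsonb (3802),
    // which is why `data @> $1` fails with "operator does not exist: jsonb @> json".
    return { data: JSON.stringify(value), oid: 114 }
  }
  return { data: String(value), oid: 0 }
}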
we should double check the behaviour of other js clients, but they may predate JSONB and so I'm inclined to change it.
we should double check the behaviour of other js clients
It looks like they just stringify it?
https://github.com/denodrivers/postgres/blob/main/query/encode.ts#L101
Can confirm that wrapping my parameter in JSON.stringify first works perfectly:
> await db.query(`select * from test where data @> $1`, [JSON.stringify({"hello": "world"})]);
{
rows: [ { data: { hello: "world" } } ],
fields: [ { name: "data", dataTypeID: 3802 } ],
affectedRows: 0
}
I won't pretend I know how a Postgres client actually works, but it seems to make sense that by stringifying it we "just let the database handle it", so we don't have to try to detect whether it needs to be json or jsonb in that context.
Not sure if this is related, but I got caught out by a string in a JSON field (that happened to look like JSON) coming out of this client as an object. Other clients (well, postgres at least) return a string and then I am expected to decode it.
The issue, as @samwillis mentioned, is that JS objects will always get encoded as the JSON type, not JSONB, as we can't tell which one is meant just by looking at the parameter.
I think there are three options:
- Instead of always encoding to JSON, we always encode to JSONB, as Postgres seems to implicitly cast JSONB to JSON when needed - however this will have a performance impact, as these conversions will happen everywhere, all the time, with no control over them.
- You can explicitly cast the parameter in the query - you still incur the above-mentioned cost, as it will be bound as JSON and then converted to JSONB, but it unblocks you in the current version of PGlite:
  await db.query(`select * from test where data @> $1::jsonb`, [{"hello": "world"}]);
- Provide a "parameter utility" such that PGlite knows that a JSON parameter is actually JSONB, which is optimal in terms of performance as it will be bound directly as JSONB:
  import { jsonb } from 'electric-sql/pglite/parameters'
  await db.query(`select * from test where data @> $1`, [jsonb({"hello": "world"})]);
I believe the best path for now is to unblock yourself by manually casting the parameter in the query; if/when we introduce this utility, you should be able to wrap your arguments and no casting will be needed anymore. What do you think @samwillis?
There's a third option: mark the parameter as the "unknown" type.
With the extended wire protocol you can specify the type of each param with its OID, or set it to 0 to mark it as unknown. We use unknown for string and number types, so Postgres can just infer what they are. If we JSON.stringify all objects (as we do now) and then set the type to 0, it may let Postgres infer the type based on the operator. I don't know though, it needs testing.
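A rough sketch of what that could look like (made-up helper and field names, not pglite's actual internals): keep JSON.stringify for objects, but declare the parameter's type OID as 0 so the server resolves json vs jsonb from the operator in the query.

// Hypothetical sketch: every parameter is sent as text with type OID 0
// ("unknown"), so the server infers the concrete type from context.
type BoundParam = { value: string | null; typeOid: number }

function bindParams(values: unknown[]): BoundParam[] {
  return values.map((v) => {
    if (v === null || v === undefined) return { value: null, typeOid: 0 }
    if (typeof v === 'object') return { value: JSON.stringify(v), typeOid: 0 }
    return { value: String(v), typeOid: 0 }
  })
}

// If the inference works as hoped, `data @> $1` would make the server treat
// $1 as jsonb - but as noted above, this needs testing.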
I think we should check node-postgres and Postgres.js and see what they do. Match at least one of them.
I think we should check node-postgres and Postgres.js and see what they do.
@samwillis
deno-postgres: JSON.stringify: https://github.com/denodrivers/postgres/blob/main/query/encode.ts#L101
Postgres.js: JSON.stringify: https://github.com/porsager/postgres/blob/master/src/types.js#L19
node-postgres: JSON.stringify probably: https://github.com/brianc/node-postgres/blob/54eb0fa216aaccd727765641e7d1cf5da2bc483d/docs/pages/features/types.mdx#strings-by-default and https://github.com/brianc/node-postgres/blob/54eb0fa216aaccd727765641e7d1cf5da2bc483d/packages/pg/lib/utils.js#L88
Am I reading this correctly? This seems equivalent to marking the parameter as the "unknown" type.
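Concretely, the common pattern across these clients seems to be roughly the following (a simplified sketch, not any one client's exact code):

// Simplified sketch: plain objects are JSON.stringify'd with no declared
// type, and the server works out whether the target column is json or jsonb.
function encodeArgument(value: unknown): string | null {
  if (value === null || value === undefined) return null
  if (value instanceof Date) return value.toISOString()
  if (typeof value === 'object' && !Array.isArray(value)) return JSON.stringify(value)
  return String(value)
}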
I have almost succeeded in working around this issue by stringifying objects before passing them down into pglite, but I'm fighting with arrays now. I want to use a JS array for both jsonb operations and x = ANY($1) queries, but the former wants true json syntax and the latter wants the postgres squiggly brackets {1,2,3}. God why didn't they just do json array syntax for that. I'm stuck.
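To make the mismatch concrete (toPostgresArrayLiteral is a made-up helper that only handles flat arrays of numbers and strings):

const ids = [1, 2, 3]
const asJson = JSON.stringify(ids)            // '[1,2,3]'  - what jsonb operations want
const asPgArray = toPostgresArrayLiteral(ids) // '{1,2,3}'  - what x = ANY($1) wants

function toPostgresArrayLiteral(values: (string | number)[]): string {
  // numbers pass through; strings are quoted, with embedded quotes/backslashes escaped
  const elements = values.map((v) =>
    typeof v === 'number' ? String(v) : `"${String(v).replace(/(["\\])/g, '\\$1')}"`
  )
  return `{${elements.join(',')}}`
}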
Okay I've been learning more about how Postgres works. This comment is really helpful: https://github.com/porsager/postgres/discussions/939#discussioncomment-10581516
Postgres.js has solved this problem perfectly, and they do it by having Postgres infer the type of the parameters as part of the query flow.
There's a third option: mark the parameter as the "unknown" type.
Now this makes sense to me. Yes, this is exactly what Postgres.js is doing.
Thanks @alexgleason, I'll take a good look at what they do and replicate it.