node-postgres
JSONB arguments not treated consistently
According to the docs, no special treatment is required for JSONB bind arguments. In practice there is:
- When the argument is a single object, it is bound correctly.
- When the argument is an Array, it is necessary to call JSON.stringify on the argument yourself.
Here's a test case that reproduces the issue:
```js
const { Client } = require('pg');
const { inspect } = require('util');

async function doIt() {
  const client = new Client();
  console.log('Connecting...');
  await client.connect();
  console.log('Connected');
  const a = [
    {
      the: 'quick',
      brown: 'fox',
      jumps: 'over the',
      lazy: 'dog',
      some: { other: 'object', dilroy: ['hello', 'world'], numpty: new Date() }
    }
  ];
  console.log('Inserting...');
  const result = await client.query(`insert into sb_json(c1) values ($1) returning *`, [a]);
  console.log(`result = ${inspect(result, { depth: null })}`);
  console.log('Disconnecting...');
  await client.end();
}

doIt()
  .then(() => console.log('All done'))
  .catch(e => console.log(inspect(e)));
```
Current documented contract is that no special treatment is required.
Possible solutions:
- Document the current behavior. This is the less ideal option, as arguably it is better to be consistent: never call JSON.stringify on args, or always do so, rather than one case for arrays and another for single items.
- Inspect the contents of the array. If it contains only primitive types, bind it as an array of that type; otherwise call JSON.stringify internally. This check would run in O(n) time, so for a very large array it would be slow.
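A minimal sketch of that second option, for illustration only (isPrimitive and serializeArrayParam are hypothetical names, not part of node-postgres):

```javascript
// Hypothetical sketch of the "inspect the array" idea; none of these
// names exist in node-postgres itself.
function isPrimitive(v) {
  return v === null || ['string', 'number', 'boolean'].includes(typeof v)
}

function serializeArrayParam(arr) {
  // All-primitive arrays would keep the existing PG-array treatment...
  if (arr.every(isPrimitive)) {
    return { kind: 'pg-array', value: arr }
  }
  // ...anything else falls back to JSON text. The .every() scan above is
  // the O(n) cost mentioned in the bullet point.
  return { kind: 'json', value: JSON.stringify(arr) }
}
```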
Likewise, when sending an object to the PostgreSQL server via a query, node-postgres will call JSON.stringify on your outbound value, automatically converting it to JSON for the server.
With the existence of jsonb[], and without knowing the desired types ahead of time, no solution that inspects JavaScript types can work.
See also #442
Yeah, I see what you're saying. It's a bit funky that node-pg assumes to stringify Objects but not Arrays. I think making it more consistent, or rather, no incoming magic (I like the magic coming out though!) would be for the best. It seems a little arbitrary, especially now that JSON has been added to the data-type mix, that JS arrays are assumed to be PG Arrays, when they could be type json or even hstore. My preference would be to make no assumptions and leave it up to the library consumer. I'm not sure if that means node-pg should JSON.stringify Arrays and Objects by default, as that would be trading one magic for another. But it certainly does seem like a sane default.
^-- This comment from the linked ticket (#442) resonates with me. Better to be consistent in behavior.
- Never call JSON.stringify.
- Always call it. (This appears to be impossible).
https://node-postgres.com/features/types would be a good place to mention the current behaviour as it seems to be mentioned only in github issues right now
@jasperblues @charmander we don't intend to use pg arrays. Is there any way to say "treat JavaScript arrays always as pg json"? We use knex, if that matters.
I see the comment, probably from @brianc, but I don't see a way to override the array handling: https://github.com/brianc/node-postgres/blob/master/packages/pg/lib/utils.js#L47
Looked for this answer for so long. If it's not fixable (not sure I understood the whole issue), this should at least be noted as a warning in the docs 👍
As a workaround you could 'wrap' your arrays in an object and turn that into the JSONB text you want:

```js
class JsonBArrayParam {
  constructor(actualArray) {
    this.actualArray = actualArray
  }
  toPostgres() {
    return JSON.stringify(this.actualArray)
  }
}

client.query('SELECT FOO WHERE Bar = $1', [new JsonBArrayParam(['one', 'two'])])
```

or something similar to that. does that work?
see here: https://github.com/brianc/node-postgres/blob/master/packages/pg/lib/utils.js#L80
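For illustration, here is a self-contained sketch of why the wrapper works, using a simplified stand-in for pg's prepareValue (prepareValueStub is invented for this sketch; the real logic lives in the utils.js linked above):

```javascript
// Simplified stand-in for pg's prepareValue, for illustration only.
function prepareValueStub(val) {
  if (Array.isArray(val)) {
    return '{...pg array literal...}' // arrays get PG-array serialization
  }
  if (val && typeof val.toPostgres === 'function') {
    return val.toPostgres() // custom serialization hook wins for objects
  }
  if (typeof val === 'object' && val !== null) {
    return JSON.stringify(val) // plain objects become JSON text
  }
  return val
}

class JsonBArrayParam {
  constructor(actualArray) {
    this.actualArray = actualArray
  }
  toPostgres() {
    return JSON.stringify(this.actualArray)
  }
}
```

Because the wrapper is an object rather than an array, it never hits the PG-array branch, and its toPostgres hook produces the JSON text you want.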
Because we don't use pg arrays, we've implemented the following workaround:

```ts
// @ts-ignore
import pgUtils from 'pg/lib/utils.js'

// a workaround to force storage of arrays as JSON until
// https://github.com/brianc/node-postgres/issues/2012
// is fixed
const originalPrepareValue = pgUtils.prepareValue
pgUtils.prepareValue = <T>(val: T): T | string => {
  if (Array.isArray(val)) {
    return JSON.stringify(val)
  }
  return originalPrepareValue(val)
}
```
Maybe it helps someone.
Just hit this too. Ugly. Can we provide some kind of bind directive to tell PG how to interpret the array, or a way to specify the required behavior for a given insert statement?
If the library exposed the types parameter for prepared statements, there might be a way to do conditional serialization based on the type.
The API could look something like this:

```js
pool.query({
  text: 'SELECT $1, $2, $3',
  values: ['1', 2, [1, 2, 3]],
  paramTypes: [null, null, OID.JSONB]
})
```
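A rough sketch of what the library could do internally with such a paramTypes array (serializeWithTypes is a hypothetical name; 3802 is jsonb's OID in the pg_type catalog):

```javascript
// Hypothetical internal helper; not part of the node-postgres API.
const OID = { JSONB: 3802 } // jsonb's OID in the pg_type catalog

function serializeWithTypes(values, paramTypes = []) {
  // Only parameters explicitly declared as JSONB get stringified;
  // everything else keeps the existing serialization path.
  return values.map((val, i) =>
    paramTypes[i] === OID.JSONB ? JSON.stringify(val) : val
  )
}
```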
I'm not sure if it's any better than wrapping the values in a custom serialized type as suggested in https://github.com/brianc/node-postgres/issues/2012#issuecomment-607415271.
This and https://github.com/brianc/node-postgres/issues/1462 seem to be duplicates of each other
Is it possible to use setTypeParser to intercept jsonb, inspect whether it's an array, and then stringify it? Which type id can be used for this?
@rightaway No, type parsers are for data coming from the PostgreSQL server. Serialization happens in utils.prepareValue, and it doesn't have parameter type information.
Thanks for the stringify workaround!
How could I run this other query? stringify doesn't work in this case:

```sql
SELECT * FROM product WHERE tags ?| ARRAY['tag1', 'tag2'];
```

I tried

```js
await pgclient.query(`SELECT * FROM product WHERE tags ?| $1`, [JSON.stringify(['tag1', 'tag2'])])
```

but the resulting query is

```
SELECT * FROM product WHERE tags ?| ["tag1","tag2"]
ERROR unhandledRejection malformed array literal: "["tag1","tag2"]"
```
The workaround provided in https://github.com/brianc/node-postgres/issues/2012#issuecomment-607463882 only works for them, because they do not use any SQL array inputs.
The jsonb ?| text[] → boolean operator expects an SQL text array on the right-hand side, not a json or jsonb array, so you shouldn't JSON.stringify your inputs in this case.
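In other words, pass the plain JS array so node-postgres serializes it as a PostgreSQL array, which is what the operator expects. A sketch (findByTags is an illustrative name; assumes a connected client as in the earlier snippets):

```javascript
// Illustrative helper: bind the JS array directly so node-postgres
// serializes it as a PG array (text[]), not as JSON text.
async function findByTags(client, tags) {
  const { rows } = await client.query(
    'SELECT * FROM product WHERE tags ?| $1',
    [tags] // note: no JSON.stringify here
  )
  return rows
}
```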
Thank you!