drizzle-orm
[BUG]: jsonb always inserted as a json string when using postgres-js.
What version of drizzle-orm are you using?
0.26.5
What version of drizzle-kit are you using?
0.18.1
Describe the Bug
Inserting an object into a Postgres jsonb field with db.insert only inserts a string when using the postgres-js adapter.
Expected behavior
With the pg package, an object is inserted using the code below, which is the expected behavior.
With the postgres-js package, a string is inserted into the table using the same code.
Environment & setup
drizzle packages as above, and, "pg": "8.11.0" and "postgres": "3.3.5"
schema.ts:
import { pgTable, jsonb } from "drizzle-orm/pg-core";
export const logs = pgTable("log", { line: jsonb("line").$type<object>() });
load.ts:
let lines: { line: object }[] = [];
let n = 0;
for await (const line of rl) {
  const lineObj = JSON.parse(line);
  lines.push({ line: lineObj });
  if (++n > numLines) {
    await runFunction(lines);
    lines = [];
    n = 0;
  }
}
await runFunction(lines);
runFunction: async (lines) => { await db.insert(logs).values(lines); }
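To make the symptom concrete, here is a minimal standalone sketch of the double encoding (an assumed model of the observed behavior, not drizzle's actual code path): the object is stringified once for the jsonb column, the driver escapes that string again, and a single JSON.parse on the stored value then yields a string instead of an object.

```typescript
// Assumed model of the reported bug, not drizzle's real internals:
// the object is stringified once by the ORM, then escaped again by the driver.
const lineObj = { level: "info", msg: "hello" };

const stringifiedOnce = JSON.stringify(lineObj);     // what the ORM produces
const storedValue = JSON.stringify(stringifiedOnce); // what lands in the jsonb column

const readBack: unknown = JSON.parse(storedValue);
console.log(typeof readBack);                        // "string" (the bug)
console.log(typeof JSON.parse(readBack as string));  // "object" (needs a second parse)
```

With the pg driver the extra escape does not happen, which matches the expected behavior described above.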
I was working on a fix there; it's somewhat stale, as it would need different behavior for pg, which expects arrays to be passed as strings: https://github.com/drizzle-team/drizzle-orm/pull/666
same issue here
same issue here +1
+1 - json/jsonb shouldn't be listed in the docs as a supported type when this bug exists.
My workaround was to use a db.execute and pass objects into the sql`` template.
const statement = sql`
INSERT INTO wikidata_article (wikidata_id, category, grade, en_raw_length, en_url_title, labels, sitelinks)
VALUES (${articleDetails.wikiDataId}, ${articleDetails.category}, ${articleDetails.grade}, ${articleDetails.enBytes}, ${articleDetails.enUrlTitle}, ${articleDetails.labels}, ${articleDetails.sitelinks})
ON CONFLICT (wikidata_id) DO UPDATE
SET
category = ${articleDetails.category},
grade = ${articleDetails.grade},
en_raw_length = ${articleDetails.enBytes},
en_url_title = ${articleDetails.enUrlTitle},
labels = ${articleDetails.labels},
sitelinks = ${articleDetails.sitelinks}
`;
await db.execute(statement);
same issue. :(
> My workaround was to use a db.execute and pass objects into the sql`` template.
This works well as a temporary solution!
You could also do a hybrid approach where you call
...
await db
.insert(table)
.values({
id: id,
...
})
...
and then call
await db.execute(
sql`UPDATE table set property = ${property} WHERE id = ${id}`
)
afterward to handle the jsonb value. Note that because this is two separate calls, I would only recommend it if you're inserting a lot of columns at once or constantly changing your schema and want to maximize type safety (if you rename a column in schema.ts, the db.insert syntax updates but the db.execute(sql...) syntax does not).
Isn't the problem that an arbitrary string fails to be detected as json/jsonb in Postgres? I stumbled over this today when I realized that somehow all values are escaped and the entire object is treated as a scalar. I have this workaround with the sql helper.
const content = { title: 'test', ... };
// before
await db
.update(Stories)
.set({ content })
.where(eq(Stories.id, story.id));
// after
await db
.update(Stories)
.set({
content: sql`${content}::jsonb`,
})
.where(eq(Stories.id, story.id));
Looking at the complexity of Postgres JSON handling, one can appreciate drizzle's design of dropping down to plain SQL/functions rather than abstracting it away or omitting it like other "ORMs". I end up doing something like this:
...set({
content: sql`jsonb_set(content, '{title}', content->'title' || '{"text": "Hello There"}'::jsonb)`
})
I was also able to bypass this bug by wrapping my values with sql``:
const product = await tx
.insert(products)
.values({
entityId: input.entityId,
eventType: input.eventType,
payload: sql`${input.payload}::jsonb`,
})
The workarounds discussed here will fail when using an array instead of an object. My column type is []::jsonb.
This is the code I am using:
const input = {identifier: "xyz", hostnames: ["foo.bar.com", "bar.foo.com"]}
await db.insert(assets).values(input).onConflictDoUpdate({
target: assets.identifier,
set: input,
});
return db.execute(
sql`UPDATE assets SET hostnames = ${input.hostnames} WHERE identifier = ${input.identifier}`,
);
The constructed query looks like this, which is of course wrong because it is using one argument for each array element:
Query: UPDATE assets SET hostnames = ($1, $2)::jsonb WHERE identifier = $3 -- params: ["foo.bar.com", "bar.foo.com", "xyz"]
PostgresError: cannot cast type record to json
Any suggestions?
Edit: Using this custom jsonb type fixed it for me: https://github.com/drizzle-team/drizzle-orm/pull/666#issuecomment-1602918513
@rverton same issue for me. I guess the temporary workaround is to use an object instead =\
I used raw SQL, but it didn't work for me because it stores the array as an object. I found a workaround using the custom type here and it worked for me: I don't need raw SQL, and the array still gets stored as an array.
Any updates? Do you accept a PR?
I am curious why this was implemented this way in the first place and hasn't been patched to the expected behaviour for months.
a better workaround https://github.com/drizzle-team/drizzle-orm/pull/666#issuecomment-1809339114
> +1 - json/jsonb shouldn't be listed in the docs as a supported type when this bug exists. My workaround was to use a db.execute and pass objects into the sql`` template. [...]
This solution works great, but I had a problem with it: I insert an array into a jsonb field, and when the array had 2+ elements, the elements were destructured, i.e.:
query:
const array_for_jsonb_field = [{"key1": 1}, {"key2": 2}]
sql`INSERT INTO "my_table" VALUES (${id}, ${val}, ${array_for_jsonb_field})`
result:
INSERT INTO "my_table" VALUES ($1, $2, ($3, $4)) // $3 and $4 should be a single array but this array with 2 elements got destructured into 2 params
In case someone encounters the same problem, you can do the following:
const array_for_jsonb_field = [{"key1": 1}, {"key2": 2}]
sql`INSERT INTO "my_table" VALUES (${id}, ${val}, ${new Param(array_for_jsonb_field)})` // I wrapped the array with `new Param`
result:
INSERT INTO "my_table" VALUES ($1, $2, $3)
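The destructuring above makes sense if the sql template treats a bare array as a list of values (one placeholder per element) while Param marks it as a single bound parameter. A toy model of that difference (simplified stand-ins, not drizzle's real sql/Param implementations):

```typescript
// Toy model: how a template binder might flatten values into $n placeholders.
class Param<T> {
  constructor(public value: T) {}
}

function bind(values: unknown[]): { placeholders: string; params: unknown[] } {
  const params: unknown[] = [];
  const parts = values.map((v) => {
    if (Array.isArray(v)) {
      // bare array: each element gets its own $n (the destructuring seen above)
      const inner = v.map((el) => {
        params.push(el);
        return `$${params.length}`;
      });
      return "(" + inner.join(", ") + ")";
    }
    // Param (or a scalar) binds as exactly one parameter
    params.push(v instanceof Param ? v.value : v);
    return `$${params.length}`;
  });
  return { placeholders: parts.join(", "), params };
}

const arr = [{ key1: 1 }, { key2: 2 }];
console.log(bind([1, "x", arr]).placeholders);            // $1, $2, ($3, $4)
console.log(bind([1, "x", new Param(arr)]).placeholders); // $1, $2, $3
```

This also explains the "cannot cast type record to json" error with arrays: `($1, $2)` is a Postgres record, not a single jsonb value.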
My use case:
Table schema:
availability: jsonb('availability')
[screenshots: the API response and the stored row in TablePlus]
But if I use
availability: sql`${req.body.availability}::jsonb` as AvailabilityT
then the API response is the same, but the row in TablePlus differs [screenshot]
Using this https://github.com/drizzle-team/drizzle-orm/pull/666#issuecomment-1602918513 also fixed the issue, I can confirm.
Just wanted to add to the above: when I use this
export const customJsonb = customType<{ data: any }>({
dataType() {
return 'jsonb';
},
toDriver(val) {
console.log('🚀 ~ toDriver:', val);
return val as any;
},
fromDriver(value) {
console.log('🚀 ~ fromDriver', value);
if (typeof value === 'string') {
try {
return JSON.parse(value) as any;
} catch {}
}
return value as any;
},
});
It does not keep object keys in the same order as I passed them; this happens in fromDriver.
Using the sql helper has the same issue:
sql`${availability}::jsonb` as AvailabilityT
And then the API response is also messed up.
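For anyone copying the customJsonb type above, the fromDriver branch is what repairs rows that were stored as escaped strings: it re-parses string values and passes already-decoded values through. A standalone sketch of just that fallback logic (no drizzle import needed):

```typescript
// Standalone sketch of the fromDriver fallback from the custom type above.
// It re-parses values that arrive as JSON strings and passes objects through.
function fromDriver(value: unknown): unknown {
  if (typeof value === "string") {
    try {
      return JSON.parse(value);
    } catch {
      // not valid JSON: return the raw string unchanged
    }
  }
  return value;
}

console.log(fromDriver('{"a":1}'));  // { a: 1 } (escaped row gets decoded)
console.log(fromDriver({ a: 1 }));   // { a: 1 } (already decoded, untouched)
console.log(fromDriver("not json")); // "not json" (left as-is)
```

Note this only papers over reads; the key-order change reported above is Postgres jsonb behavior, not something fromDriver can restore.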
I just contributed to the bounty on this issue.
Each contribution to this bounty has an expiry time and will be auto-refunded to the contributor if the issue is not solved before then.
To make this a public bounty or have a reward split, the maintainer can reply to this comment.
I just contributed to the bounty on this issue:
https://until.dev/bounty/drizzle-team/drizzle-orm/724
The current bounty for completing it is $70.00 if it is closed within 27 days, and decreases after that.
Others can also contribute to the bounty to increase it.
> It does not sort object in same order as i passed, this happens under the fromDriver [...]
@MariuzM I don't think this is an issue, but it's by design. The JSONB stores JSON data as a binary representation of the JSONB value, which eliminates whitespace, duplicate keys, and key ordering. If key ordering is important for you, use JSON instead, but you'll lose all the other benefits of JSONB.
> It does not sort object in same order as i passed, this happens under the fromDriver [...]
@MariuzM as indicated by @pfurini, it's Postgres's jsonb design:
> By contrast, jsonb does not preserve white space, does not preserve the order of object keys, and does not keep duplicate object keys. If duplicate keys are specified in the input, only the last value is kept.
https://www.postgresql.org/docs/16/datatype-json.html
This patch-package is working for me, from https://github.com/drizzle-team/drizzle-orm/pull/1785
diff --git a/node_modules/drizzle-orm/postgres-js/driver.js b/node_modules/drizzle-orm/postgres-js/driver.js
index 7e48e8c..219e0a0 100644
--- a/node_modules/drizzle-orm/postgres-js/driver.js
+++ b/node_modules/drizzle-orm/postgres-js/driver.js
@@ -12,6 +12,8 @@ function drizzle(client, config = {}) {
client.options.parsers[type] = transparentParser;
client.options.serializers[type] = transparentParser;
}
+ client.options.serializers['114'] = transparentParser;
+ client.options.serializers['3802'] = transparentParser;
const dialect = new PgDialect();
let logger;
if (config.logger === true) {
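For context on why the patch works: postgres-js keeps per-type serializers keyed by Postgres type OID, and 114/3802 are the OIDs for json/jsonb. Registering an identity function for those OIDs stops the driver from escaping the already-stringified payload a second time. A rough sketch of the idea (an assumed simplification of the driver's serializer table, not its actual code):

```typescript
// Assumed simplification of postgres-js's serializer table (keyed by type OID).
const transparentParser = (val: string) => val; // identity: pass the value through

const serializers: Record<string, (v: string) => string> = {};
// the patch: json (OID 114) and jsonb (OID 3802) are no longer escaped again
serializers["114"] = transparentParser;
serializers["3802"] = transparentParser;

const payload = JSON.stringify({ a: 1 }); // drizzle has already stringified once
const serialize = serializers["3802"] ?? JSON.stringify; // fall back to escaping
const sent = serialize(payload);
console.log(sent === payload); // true: no double encoding for jsonb
```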
The patch from @arjunyel works for me. What's holding us back from merging?
@timsuchanek my understanding is the issue is backwards compatibility: what do they do about people who already inserted JSON in the wrong format?
@arjunyel I found this article that shows how to convert data from escaped JSON back to true JSON. Maybe we can include this in the docs or provide a migration path for folks with already-escaped JSON:
https://dev.to/mrmurphy/so-you-put-an-escaped-string-into-a-json-column-2n67
This urgently needs to be fixed.
I just wanted to point out that this issue breaks creating and editing jsonb fields via drizzle-kit studio.
☝️ Those of you who applied sql helper workaround should be careful manipulating data through studio.
> I too was able to bypass the issue with this bug by wrapping my values call with sql``: payload: sql`${input.payload}::jsonb` [...]
Out of curiosity, wouldn't this introduce the possibility of SQL injection? Or does ::jsonb coerce the data into jsonb before whatever is in input.payload runs through the sql function?
There has got to be a solution. Major version change if needed but the current state is really broken!
@tonyxiao Agreed, this is a major issue that needs to be addressed