Bulk upserts fail due to missing double quotes on case-sensitive table names with nextval()
Environment
- PostgreSQL version: supabase/postgres:15.1.1.78
- PostgREST version: supabase/postgrest:v12.2.0
- Operating system: macOS
Description of issue
Hi there, I hope this is the right spot to track this ticket, but I ran into an issue where PostgREST is not correctly double-quoting my table's sequence name when attempting upserts that use nextval().
For example, I receive an error stating:
{
error: {
code: '42P01',
details: null,
hint: null,
message: 'relation "public.fanlistsegmentfilter_id_seq" does not exist'
},
data: null,
count: null,
status: 404,
statusText: 'Not Found'
}
Where my DB table definition is:
create table
  public."FanListSegmentFilter" (
    id bigint generated by default as identity,
    "fanListSegmentId" bigint not null,
    "primaryValue" text null,
    operation text not null,
    field text not null,
    "fieldId" text null,
    "secondaryValue" text null,
    code text not null,
    constraint FanListSegmentFilter_pkey primary key (id),
    constraint FanListSegmentFilter_fanListSegmentId_fkey foreign key ("fanListSegmentId") references "FanListSegment" (id) on update cascade on delete cascade
  ) tablespace pg_default;
Since the error message prints my table name in all lowercase, this led me to believe it had to do with quoting. Looking in my Postgres logs, I saw the following (the key bit is the nextval call):
WITH pgrst_source AS (INSERT INTO "public"."FanListSegmentFilter" ("code", "fanListSegmentId", "field", "fieldId", "id",
"operation", "primaryValue",
"secondaryValue") SELECT "pgrst_body"."code",
"pgrst_body"."fanListSegmentId",
"pgrst_body"."field",
"pgrst_body"."fieldId",
"pgrst_body"."id",
"pgrst_body"."operation",
"pgrst_body"."primaryValue",
"pgrst_body"."secondaryValue"
FROM (SELECT $1 AS json_data) pgrst_payload,
LATERAL (SELECT jsonb_agg(jsonb_build_object(
'id',
nextval('public.FanListSegmentFilter_id_seq')) ||
elem) AS val
from jsonb_array_elements(pgrst_payload.json_data) elem) pgrst_json_defs,
LATERAL (SELECT "code",
"fanListSegmentId",
"field",
"fieldId",
"id",
"operation",
"primaryValue",
"secondaryValue"
FROM jsonb_to_recordset(pgrst_json_defs.val) AS _("code" text,
"fanListSegmentId" bigint,
"field" text,
"fieldId" text,
"id" bigint,
"operation" text,
"primaryValue" text,
"secondaryValue" text) ) pgrst_body
WHERE set_config('pgrst.inserted',
(coalesce(
nullif(current_setting('pgrst.inserted', true), '')::int,
0) +
1)::text,
true) <>
'0' ON CONFLICT ("id") DO UPDATE SET "code" = EXCLUDED."code", "fanListSegmentId" = EXCLUDED."fanListSegmentId", "field" = EXCLUDED."field", "fieldId" = EXCLUDED."fieldId", "id" = EXCLUDED."id", "operation" = EXCLUDED."operation", "primaryValue" = EXCLUDED."primaryValue", "secondaryValue" = EXCLUDED."secondaryValue" WHERE
set_config('pgrst.inserted', (coalesce(nullif(current_setting('pgrst.inserted', true), '')::int, 0) - 1)::text,
true) <> '-1' RETURNING 1)
SELECT '' AS total_result_set,
pg_catalog.count(_postgrest_t) AS page_total,
array []::text[] AS header,
''::text AS body,
nullif(current_setting('response.headers', true), '') AS response_headers,
nullif(current_setting('response.status', true), '') AS response_status,
nullif(current_setting('pgrst.inserted', true), '')::int AS response_inserted
FROM (SELECT * FROM pgrst_source) _postgrest_t
As you can see, it is using nextval('public.FanListSegmentFilter_id_seq') instead of something like nextval('public."FanListSegmentFilter_id_seq"'). If I throw this into DataGrip, it indeed flags errors until I double-quote the sequence name.
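To illustrate the quoting rule at play: an unquoted identifier inside nextval() is case-folded to lowercase by Postgres, so the mixed-case sequence name must be wrapped in double quotes (with any embedded double quotes doubled). A minimal TypeScript sketch of such quoting — `quoteIdent` and `nextvalExpr` are hypothetical helpers for illustration, not PostgREST's actual code:

```typescript
// Hypothetical helper: quote a Postgres identifier so its case is preserved.
// Embedded double quotes are escaped by doubling them, per Postgres rules.
function quoteIdent(name: string): string {
  return '"' + name.replace(/"/g, '""') + '"';
}

// Build a nextval() expression from a schema and sequence name,
// quoting each part so mixed-case names survive case-folding.
function nextvalExpr(schema: string, sequence: string): string {
  return `nextval('${quoteIdent(schema)}.${quoteIdent(sequence)}')`;
}

console.log(nextvalExpr("public", "FanListSegmentFilter_id_seq"));
// nextval('"public"."FanListSegmentFilter_id_seq"')
```

With quoting applied, the generated statement would reference the sequence exactly as it exists in the catalog instead of the lowercased "public.fanlistsegmentfilter_id_seq" from the error above.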
I am wondering why this is happening. I use the supabase-js library ("@supabase/supabase-js": "2.45.3") and attempt a bulk upsert using:
THIS DOES NOT WORK
const response = await ctx.supabase.from("FanListSegmentFilter").upsert(
  filters.map((filter) => {
    const filterId = Math.max(0, filter.id);
    const form = formatDbFields(filter, nullableFanListSegmentFilterFields);
    const item: Database["public"]["Tables"]["FanListSegmentFilter"]["Insert"] = {
      ...(!!filterId && { id: filterId }),
      fanListSegmentId: updatedFanListSegment.id,
      field: form.field,
      operation: form.operation,
      primaryValue: form.primaryValue,
      secondaryValue: form.secondaryValue,
      fieldId: form.fieldId,
      code: form.code,
    };
    return item;
  }),
  { defaultToNull: false }
);
Funnily enough, if I just loop through each filter and upsert one at a time, it works fine... so something is weird with the bulk upsert.
THIS DOES WORK
for (const filter of filters) {
  const filterId = Math.max(0, filter.id);
  const form = formatDbFields(filter, nullableFanListSegmentFilterFields);
  const response = await ctx.supabase.from("FanListSegmentFilter").upsert({
    ...(!!filterId && { id: filterId }),
    fanListSegmentId: updatedFanListSegment.id,
    field: form.field,
    operation: form.operation,
    primaryValue: form.primaryValue,
    secondaryValue: form.secondaryValue,
    fieldId: form.fieldId,
    code: form.code,
  });
  console.log(response);
}
Thanks for your time! If there's anything else I can provide please let me know.
EDIT: here is the raw, unformatted Postgres log output for the error statement:
"172.19.0.9 2024-08-31 07:59:20.793 UTC [1965] authenticator@postgres STATEMENT: WITH pgrst_source AS (INSERT INTO \"public\".\"FanListSegmentFilter\"(\"code\", \"fanListSegmentId\", \"field\", \"fieldId\", \"id\", \"operation\", \"primaryValue\", \"secondaryValue\") SELECT \"pgrst_body\".\"code\", \"pgrst_body\".\"fanListSegmentId\", \"pgrst_body\".\"field\", \"pgrst_body\".\"fieldId\", \"pgrst_body\".\"id\", \"pgrst_body\".\"operation\", \"pgrst_body\".\"primaryValue\", \"pgrst_body\".\"secondaryValue\" FROM (SELECT $1 AS json_data) pgrst_payload, LATERAL (SELECT jsonb_agg(jsonb_build_object('id', nextval('public.FanListSegmentFilter_id_seq')) || elem) AS val from jsonb_array_elements(pgrst_payload.json_data) elem) pgrst_json_defs, LATERAL (SELECT \"code\", \"fanListSegmentId\", \"field\", \"fieldId\", \"id\", \"operation\", \"primaryValue\", \"secondaryValue\" FROM jsonb_to_recordset(pgrst_json_defs.val) AS _(\"code\" text, \"fanListSegmentId\" bigint, \"field\" text, \"fieldId\" text, \"id\" bigint, \"operation\" text, \"primaryValue\" text, \"secondaryValue\" text) ) pgrst_body WHERE set_config('pgrst.inserted', (coalesce(nullif(current_setting('pgrst.inserted', true), '')::int, 0) + 1)::text, true) <> '0' ON CONFLICT(\"id\") DO UPDATE SET \"code\" = EXCLUDED.\"code\", \"fanListSegmentId\" = EXCLUDED.\"fanListSegmentId\", \"field\" = EXCLUDED.\"field\", \"fieldId\" = EXCLUDED.\"fieldId\", \"id\" = EXCLUDED.\"id\", \"operation\" = EXCLUDED.\"operation\", \"primaryValue\" = EXCLUDED.\"primaryValue\", \"secondaryValue\" = EXCLUDED.\"secondaryValue\"WHERE set_config('pgrst.inserted', (coalesce(nullif(current_setting('pgrst.inserted', true), '')::int, 0) - 1)::text, true) <> '-1' RETURNING 1) SELECT '' AS total_result_set, pg_catalog.count(_postgrest_t) AS page_total, array[]::text[] AS header, ''::text AS body, nullif(current_setting('response.headers', true), '') AS response_headers, nullif(current_setting('response.status', true), '') AS response_status, 
nullif(current_setting('pgrst.inserted', true),'')::int AS response_inserted FROM (SELECT * FROM pgrst_source) _postgrest_t"
What's strange is that the bulk upsert works if no values conflict ON CONFLICT ("id") (i.e., it is only inserting new items), but if any values conflict on "id", the bulk upsert fails.
Turns out this has been fixed accidentally by my "refactor" commit 1747a4fcc4708b33cfd4536588253a9bac6a10be in #3644. IIRC, this has not been released yet and is only available in the latest devel version so far.
We should probably still add a test case, though.
oh nice! glad it was already caught, thanks :)