postgrest-js
Support bulk update
Feature request
Is your feature request related to a problem? Please describe.
I'd love to have the ability to update multiple rows at once, with different data for each row. It seems that right now update only accepts one filter and updates all the rows that match that filter with the same data. For my use case, I'd like to update different rows with different data in one query. The only way that I can do that right now is to make multiple network requests, but that doesn't seem very efficient (especially if I need to potentially update dozens of rows).
Describe the solution you'd like
A way to update multiple rows with different data for each row in one network request.
Describe alternatives you've considered
It seems like it's possible to do this using raw SQL (https://stackoverflow.com/a/18799497), or you can make multiple network requests, as I mentioned earlier. Unfortunately, both are non-ideal solutions. It would be great to have this functionality be supported directly in supabase-js.
Interesting feature @churichard, thanks a lot for proposing it and suggesting a possible implementation.
We'll definitely consider supporting it out-of-the-box. In the meantime, you can create a SQL function that lets you use the update..from syntax. Example:
-- example table
create table projects(
  id int
, name text
);

insert into projects values (1, 'project-1'), (2, 'project-2');

-- function that wraps the update..from syntax
create or replace function update_many_projects(payload json) returns setof projects as $$
  update projects as p
  set name = x.name
  from (
    select * from json_populate_recordset(null::projects, payload)
  ) as x(id, name)
  where p.id = x.id
  returning p.*;
$$ language sql;
And then you can use the function through supabase.rpc. It'd be a single request:
const { data, error } = await supabase
  .rpc('update_many_projects', {
    payload: [
      { id: 1, name: 'updated-project-1' },
      { id: 2, name: 'updated-project-2' },
    ],
  })
@churichard I forgot about this option, but in fact, you can already do what you want with an upsert.
const { data, error } = await supabase
.from('projects')
.upsert([{"id": 1, "name": "updated-project-1"}, {"id": 2, "name": "updated-project-2"}])
@churichard The upsert is cleaner IMO. I'll close this for now, but let me know if that doesn't work for you.
I've tried out the upsert, and it seemed to work at first glance. However, because upsert creates a new record if it doesn't exist, it requires me to pass in all of the properties that are needed for the record. This can result in a race condition where if one of the properties is updated elsewhere, then it might be overwritten by the old property. It also requires me to know all of the properties for the particular record that I'm updating, which is not ideal.
I still think a dedicated update would be better because it would allow me to pass in only the properties that I would need, and it would return an error if the record doesn't exist (rather than creating the record). The race condition, in particular, is a case where update would be strictly better than upsert.
I still think a dedicated update would be better because it would allow me to pass in only the properties that I would need, and it would return an error if the record doesn't exist (rather than creating the record).
Ah, that's not true. Neither a regular UPDATE nor your proposed multi-row UPDATE above will fail or return an error when the id is not found; it will simply succeed with "no row updated" (UPDATE 0).
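In supabase-js terms the same behavior looks like this: updating an id that doesn't exist comes back with no error and an empty result set. A sketch, reusing the projects table from above:

const { data, error } = await supabase
  .from('projects')
  .update({ name: 'does-not-matter' })
  .eq('id', 999); // assume no row with this id exists

// error is null and data is an empty array: the request "succeeded"
// with zero rows updated, mirroring Postgres' UPDATE 0.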
also requires me to know all of the properties for the particular record that I'm updating
I think this one will be lifted once https://github.com/supabase/postgrest-js/issues/173 is solved.
This can result in a race condition where if one of the properties is updated elsewhere, then it might be overwritten by the old property.
To solve concurrent updates I think we need an etag. But this is a separate issue being tracked at: https://github.com/PostgREST/postgrest/issues/1069.
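Until something like that lands, a common client-side workaround is optimistic locking with a version column (a different technique from the etag mechanism tracked above; the version column here is a hypothetical addition to the example projects table):

// Assumes projects has an integer version column bumped on every write.
const currentVersion = 1; // version previously read for this row

const { data, error } = await supabase
  .from('projects')
  .update({ name: 'updated-project-1', version: currentVersion + 1 })
  .eq('id', 1)
  .eq('version', currentVersion); // matches nothing if someone updated first

if (!error && data?.length === 0) {
  // The row changed under us: re-fetch and retry, or surface a conflict.
}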
I still think a dedicated update would be better because it would allow me to pass in only the properties that I would need, and it would return an error if the record doesn't exist
Hm, the only way I see we could make this error is by ensuring that the number of rows updated is the same as the number of rows passed in the payload.
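Since the update_many_projects function above returns the updated rows, that check can at least be approximated client-side; a sketch:

const payload = [
  { id: 1, name: 'updated-project-1' },
  { id: 2, name: 'updated-project-2' },
];

const { data, error } = await supabase.rpc('update_many_projects', { payload });

// The function returns setof projects, so a length mismatch means some
// ids in the payload matched no existing row.
if (!error && data?.length !== payload.length) {
  // Treat as an error: some rows were silently skipped.
}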
I still think a dedicated update would be better because it would allow me to pass in only the properties that I would need, and it would return an error if the record doesn't exist (rather than creating the record).
I do agree that UPSERT will not help if you don't want to create another record and would rather fail (ideally) or succeed silently, ignoring non-matches.
I think upsert currently doesn't work. This worked for me:
const { data, error } = await supabase
  .from('order_products')
  .insert(
    [{ some_column: 'someValue' }, { some_column: 'someValue' }],
    { upsert: true }
  )
@muhammednasimda Hmm, you might be using an older version of supabase-js. The upsert function was added in v1.7.0 and does the same thing as insert with { upsert: true }.
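For anyone landing here on an older snippet, the two spellings should be interchangeable from v1.7.0 onwards:

const rows = [{ some_column: 'someValue' }];

// Equivalent calls in supabase-js v1.7.0+:
await supabase.from('order_products').insert(rows, { upsert: true });
await supabase.from('order_products').upsert(rows);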
Should be solved when https://github.com/PostgREST/postgrest/issues/1959 is implemented.
New requests for this feature:
- https://github.com/supabase/supabase/discussions/3447
How do you upsert() when the target table has multiple indices (enforcing some unique constraints)? I started this discussion, but would love to hear input from this thread too. It seems like there isn't a good way to "swap" values that have a unique constraint applied.
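One partial answer: supabase-js lets you choose which unique constraint the upsert resolves on via the onConflict option. That only selects the constraint, though; it doesn't make swapping two unique values possible (in Postgres a swap within one statement generally needs the constraint declared DEFERRABLE). A sketch with a hypothetical pages table that has unique constraints on both id and slug:

// Resolve conflicts on the slug constraint instead of the primary key:
const { data, error } = await supabase
  .from('pages')
  .upsert([{ slug: 'home', title: 'Home' }], { onConflict: 'slug' });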
I tried your example @steve-chavez and it works like a charm
So I wanted to create my own function:
create or replace function update_blocks_order(payload json) returns setof page_blocks as $$
  update page_blocks as pb
  set display_order = x.display_order
  from (
    select * from json_populate_recordset(null::page_blocks, payload)
  ) as x(id, display_order)
  where pb.id = x.id
  returning pb.*;
$$ language sql;
but I got the error: column reference "display_order" is ambiguous. So I changed the function to:
create or replace function update_blocks_order(payload json) returns setof page_blocks as $$
  update page_blocks as pb
  set display_order = x.new_display_order
  from (
    select * from json_populate_recordset(null::page_blocks, payload)
  ) as x(id, new_display_order)
  where pb.id = x.id
  returning pb.*;
$$ language sql;
Then I called it with:
await supabase.rpc('update_blocks_order', {
  payload: [
    { id: 11, new_display_order: 1 },
    { id: 1, new_display_order: 2 },
  ],
});
but I can't make it work. new_display_order is always NULL. I tried display_order as well with the same NULL result...
Could you help me? What am I doing wrong?
I finally got it working with this function (json_populate_recordset only fills in fields whose names match columns of page_blocks, so the new_display_order keys in my earlier payload were ignored and came back NULL):
create or replace function update_blocks_order(payload json) returns setof page_blocks as $$
  update page_blocks as p
  set display_order = x.display_order
  from (
    select id, display_order from json_populate_recordset(null::page_blocks, payload)
  ) as x
  where p.id = x.id
  returning p.*;
$$ language sql;
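The payload keys then have to match the real column names, so the earlier call needs display_order instead of new_display_order:

await supabase.rpc('update_blocks_order', {
  payload: [
    { id: 11, display_order: 1 },
    { id: 1, display_order: 2 },
  ],
});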
Any update on this topic?
Is there a difference between newer and older projects? I have used upsert to update an order field successfully in newer projects, but in an older one I get this error.
Data that I'd like to update:
const toUpsert = [
  { id: 345, order: 0, parent_id: 68 },
  { id: 347, order: 1, parent_id: 68 },
  { id: 353, order: 2, parent_id: 68 },
  { id: 350, order: 3, parent_id: 68 },
  { id: 351, order: 4, parent_id: 68 },
  { id: 346, order: 5, parent_id: 68 },
  { id: 348, order: 6, parent_id: 68 },
  { id: 349, order: 7, parent_id: 68 },
  { id: 352, order: 8, parent_id: 68 },
  { id: 354, order: 9, parent_id: 68 }
];

const { error: upsertError } = await dbAdmin
  .from("journals_journalitem")
  .upsert(toUpsert);
Error:
{
  message: 'null value in column "name" violates not-null constraint',
  code: '23502',
  details: 'Failing row contains (345, null, null, null, null, null, null, 68, null, null, null, null, 0).',
  hint: null
}
So it looks like it tries to insert (?)
Edit: OK, it looks like some of my data was indeed violating the not-null constraint (maybe because I updated the model afterwards), so I guess upsert validates this and raises an error.
@steve-chavez I still think there is a need for a proper bulk update function.
When I try to upsert a number of records, it seems to fire off any INSERT database triggers I have on the data, which is not what I want, since I only want an update. This is true even if all the records existed already – could upsert be modified to only update records which already existed?
I still think there is a need for a proper bulk update function.
@andrashann Yes, I agree. We're working on the feature here https://github.com/PostgREST/postgrest/pull/2311.
Edit: Unfortunately we had an issue with pg-safeupdate and this feature was rolled back for v10.
As a workaround, one can:
- select the rows matching the criteria (let's say on the column id):
const { data: existingRowsIds } = await supabaseClient
  .from<{ id: string }>('myTable')
  .select('id')
  .in('id', ids);
- filter out the rows absent from the first request's result:
const existingIdsSet = new Set(existingRowsIds?.map(({ id }) => id));
const rowsToUpdate = initialRows.filter((row) => existingIdsSet.has(row.id));
- and lastly do an upsert of the rowsToUpdate set. It is the initial set stripped of the new rows:
await supabaseClient
  .from('myTable')
  .upsert(rowsToUpdate, { onConflict: 'myUniqueKey' });
It's not perfect: it still requires executing 2 requests (but potentially instead of dozens) and might have some side effects with limit(), but it's good enough for me for the time being.
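For reference, the three steps combine into a small helper; a sketch reusing the hypothetical myTable and myUniqueKey names from the snippets above:

// Upsert only those rows whose ids already exist, so nothing new is inserted.
async function updateExistingRows(initialRows: { id: string }[]) {
  const ids = initialRows.map((row) => row.id);

  const { data: existingRowsIds, error } = await supabaseClient
    .from<{ id: string }>('myTable')
    .select('id')
    .in('id', ids);
  if (error) throw error;

  const existingIds = new Set(existingRowsIds?.map(({ id }) => id));
  const rowsToUpdate = initialRows.filter((row) => existingIds.has(row.id));

  return supabaseClient
    .from('myTable')
    .upsert(rowsToUpdate, { onConflict: 'myUniqueKey' });
}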
@andrashann but it will not fix your trigger issue as an upsert triggers both the insert and update triggers :/
Is anything being done here? I need this functionality and would have to fall back on manual SQL queries just for this, as I have the same problem as @andrashann.
Do you care to share some snippets of the manual SQL? I'm also stuck doing 50+ loops for this bulk update issue.
Do you care to share some snippets of the manual SQL? I'm also stuck doing 50+ loops for this bulk update issue.
Sorry, I switched to Prisma ORM instead; the supabase API was not sophisticated enough for my project.
This would be a great feature, excited to get it :)
+1 from me, this is definitely needed as upsert does not do the same job
Where are we at on this with postgrest support?
any updates?
Shame to see no updates here. I get a list of 200 records that already exist and have to be updated each day. I would love to bulk update them instead of doing one at a time.
+1 for this feature.
I need to update some embeddings in bulk, and doing so one by one is definitely not ideal.
I guess I could try the bulk upsert?
const { data, error } = await supabase
  .from('countries')
  .upsert([
    { id: 1, name: 'Albania' },
    { id: 2, name: 'Algeria' },
  ])
  .select()
Edit: The bulk upsert is working very well!
I still think there is a need for a proper bulk update function.
@andrashann Yes, I agree. We're working on the feature here PostgREST/postgrest#2311.
Edit: Unfortunately we had an issue with pg-safeupdate and this feature was rolled back for v10.
Is there a new PR or issue where the feature is being worked on now, since the roll-back? I'm still hoping for a way to bulk update one or two column in many rows without having to fetch and then pass back all of the other values in each row.
I need a bulk UPDATE, not UPSERT.
I need to soft delete multiple resources at the same time.
This operation should not cause an INSERT to be executed.
I don't want to know, and shouldn't need to know, anything other than the id and the current time to put in delete_at.
Is doing this via rpc the only option?
Hopefully there will be an update on this soon.
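Worth noting for this particular case: when every row gets the same values, as with a soft delete that only sets delete_at to the current time, no bulk-update feature is needed; a single filtered update already does it in one request. A sketch, with the table name resources as a hypothetical stand-in:

const ids = [1, 2, 3]; // ids of the resources to soft delete

const { error } = await supabase
  .from('resources') // hypothetical table name
  .update({ delete_at: new Date().toISOString() })
  .in('id', ids);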
I still think there is a need for a proper bulk update function.
@andrashann Yes, I agree. We're working on the feature here PostgREST/postgrest#2311. Edit: Unfortunately we had an issue with pg-safeupdate and this feature was rolled back for v10.
Is there a new PR or issue where the feature is being worked on now, since the roll-back? I'm still hoping for a way to bulk update one or two column in many rows without having to fetch and then pass back all of the other values in each row.
Seems like this is the one. I am waiting for this feature too!
I'd also like to add my voice to this request. I didn't see this argument made by anyone else, so: my table has an RPC, and while update requires "update" privileges, upsert requires "delete and insert" privileges. And this will mess with the roles system we've implemented.