Remove "Body exceeded 1mb limit" error
Hi there,
I'm trying to seed my local database with 1000 users and 1000 posts.
However I'm greeted with the following error:
"Body exceeded 1mb limit"
So Postgres is complaining that the seed.sql file is too big.
I've tested inserting a smaller number of rows and it works fine.
Any ideas how I could get around this?
Thanks!
Thanks for this @vbylen - transferring to the CLI repo so that the team can check it out!
We have moved to executing seed SQL using pgx, which avoids the Docker API limit. Feel free to reopen this ticket if the latest version doesn't work for you.
Hello @sweatybridge,
I'm currently toying around with a seed file that's around 3mb, which is mostly JSON plus a few functions for inserting.
I'm running 1.8.2 and I'm getting the following error while trying to seed:
Error: bufio.Scanner: token too long
I also tried to seed in the dashboard (local), but the limit there is still 1mb (Body exceeded 1mb limit).
Is there another way to seed large amounts of data?
Edit: I'll need to do the same in my cloud instance of supabase too. I assume the 1mb will still be an issue on that dashboard?
This data is being pulled across from another platform and includes user accounts.
:tada: This issue has been resolved in version 1.8.4 :tada:
The release is available on:
Your semantic-release bot :package::rocket:
Hi @eeston, we have made the scanner buffer size configurable. You might need to play around with buffer sizes to fit your seed file. Perhaps 5mb could be a good start:
SUPABASE_SCANNER_BUFFER_SIZE=5mb supabase db reset
Let me know if it works for you locally.
I'll need to do the same in my cloud instance of supabase too. I assume the 1mb will still be an issue on that dashboard?
Yes, the hosted dashboard has the same body size limit. We will look into extending the CLI to push your seed data to your cloud instance using supabase db push. The related ticket is https://github.com/supabase/cli/issues/160
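In the meantime, one workaround (not the CLI feature tracked in that ticket, just a sketch using plain psql) is to run the seed file against the hosted database directly. The connection string below is illustrative; PROJECT_REF, YOUR_PASSWORD, and the file path are placeholders for your own project's values.

```bash
# Hypothetical example: apply a local seed file to a hosted Supabase project.
# Replace PROJECT_REF and YOUR_PASSWORD with your project's values.
psql "postgresql://postgres:YOUR_PASSWORD@db.PROJECT_REF.supabase.co:5432/postgres" \
  -f supabase/seed.sql
```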
Fantastic...I'll test this at the weekend. Thanks for your help @sweatybridge
Hi @sweatybridge,
~~Just had a play around with this and unfortunately it still fails with the same error. Not a big deal locally, but for the hosted dashboard I think I'll end up splitting the data into multiple chunks.~~ I assume the csv upload option is also limited to 1mb?
Looks like I was a minor version behind. This has been addressed in 1.8.4 and works perfectly. Ta!
I assume the csv upload option is also limited to 1mb?
I just checked with our support team. The csv upload option does not impose such size limits. You should be able to upload 3mb of seed data using it.
Is there another way to seed large amounts of data?
The best way to seed a lot of data is through the COPY command: https://www.postgresql.org/docs/current/sql-copy.html
It looks like you've got mostly JSON data - I have a small tutorial which could help with that: https://paul.copplest.one/knowledge/tech/postgres-data.html
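For reference, here is a minimal sketch of what that could look like from the command line, using psql's client-side \copy so the file only has to exist on your machine. The table name, columns, and CSV file are made up for illustration.

```bash
# Illustrative only: bulk-load a local CSV into a hypothetical public.profiles table.
# \copy runs client-side, so it also works against a remote database.
psql "$DATABASE_URL" \
  -c "\copy public.profiles (id, email, raw_json) FROM 'profiles.csv' WITH (FORMAT csv, HEADER true)"
```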
Thanks for the info @kiwicopple.
I ended up inserting the data in 5000-row chunks... wasn't too much trouble. Good to know for next time though! 👍
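If it helps anyone later, a rough way to do that chunking from the shell (a sketch that assumes each INSERT statement sits on its own line; it would break multi-line statements or COPY blocks):

```bash
# Sketch: split a large seed file into ~5000-line pieces and apply each one.
# Assumes one complete statement per line.
split -l 5000 seed.sql seed_chunk_
for f in seed_chunk_*; do
  psql "$DATABASE_URL" -f "$f"
done
```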
Hi guys, sorry for jumping into this discussion 🙌. I'm facing the same error on version 1.42.7 (installed with brew today).
@ricardosikic have you tried setting SUPABASE_SCANNER_BUFFER_SIZE=5MB or a larger value before supabase start?
Hi, no, I just ran supabase db remote commit and then dumped the production data with --data-only. Where is that setting located, in config.toml?
It's an env var, so you can do it like this:
SUPABASE_SCANNER_BUFFER_SIZE=5MB supabase start
Hi ✋, it didn't work. I think it's the format of my dump.
What's the size of your seed.sql file? Could you also post the cli logs here?
Hi, 4kb. But the format is like this. I think it's how images for a post are stored.
--
-- Data for Name: objects; Type: TABLE DATA; Schema: storage; Owner: supabase_storage_admin
--
COPY storage.objects (id, bucket_id, name, owner, created_at, updated_at, last_accessed_at, metadata) FROM stdin;
d2-86ba-5ec4dfe1891d avatars 0.4968305670106038.jpg e-4587-9cf7-468aed9c10e8 2023-02-12 01:02:38.420795+00 2023-02-12 01:02:38.892079+00 2023-02-12 01:02:38.420795+00 {"eTag": "\\"700f2fc6bb91124e58b5ad51cff51449\\"", "size": 268015, "mimetype": "image/jpeg", "cacheControl": "max-age=3600", "lastModified": "2023-02-12T01:02:39.000Z", "contentLength": 268015, "httpStatusCode": 200}
4-48cf-9635-e934e5083988 logo-image logo/.emptyFolderPlaceholder \N 2023-02-25 19:34:24.734096+00 2023-02-25 19:34:24.82976+00 2023-02-25 19:34:24.734096+00 {"eTag": "\\"d41d8cd98f00b204e9800998ecf8427e\\"", "size": 0, "mimetype": "application/octet-stream", "cacheControl": "max-age=3600", "lastModified": "2023-02-25T19:34:25.000Z", "contentLength": 0, "httpStatusCode": 200}
@ricardosikic I see, yup the error message is different from what's posted previously in this thread.
Also, is this from db dump --data-only or regular pg_dump? We are using column inserts, so I'm surprised to see COPY statements here.
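For context, plain pg_dump emits COPY ... FROM stdin blocks by default, while its --column-inserts flag switches to one INSERT per row, which is the style the CLI dump produces. A quick sketch of the difference (the connection string is a placeholder):

```bash
# Default pg_dump data output uses COPY blocks:
pg_dump --data-only "$DATABASE_URL" > seed_copy.sql

# --column-inserts emits an INSERT ... VALUES per row instead:
pg_dump --data-only --column-inserts "$DATABASE_URL" > seed_inserts.sql
```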
Hi, yes, with that command and your last advice my Error: bufio.Scanner: token too long error disappeared. It was related to the bufio error. Thanks 👍
Hi @jadghadry, you can set env vars in PowerShell like this:
$Env:SUPABASE_SCANNER_BUFFER_SIZE = '5mb'
supabase start
Hello @sweatybridge, I use Supabase on a Docker host. Which Docker container should take this env?
which docker container should take this env?
This env is set on your host machine where the supabase cli is installed. You don't need to set it inside any container.