
Blob data size exceeds limit.

victorlcm opened this issue on Sep 04 '24

Hi, we're experiencing this error when trying to set up a Postgres mirror to S3:

```
ERROR:  unable to submit job: "status: Unknown, message: \"unable to start PeerFlow workflow: Blob data size exceeds limit.\", details: [], metadata: MetadataMap { headers: {\"content-type\": \"application/grpc\"} }"
```

We've investigated, and it appears to be Temporal's blob size limitation on payloads larger than 2 MB. We have a database with approximately 50k tables, so that is likely the cause. Do you know if there's a way around this? Maybe storing the payload in an external location and accessing it from the Temporal workflow?
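For reference, Temporal's usual escape hatch for oversized payloads is the "claim check" pattern: a custom `PayloadCodec` that offloads large blobs to external storage and leaves only a small reference in the workflow history. Below is a minimal Go sketch of that idea; the `blobStore` interface, the `blob-ref` metadata key, and the key scheme are all hypothetical, not part of PeerDB or the Temporal SDK:

```go
package blobcodec

import (
	"fmt"

	commonpb "go.temporal.io/api/common/v1"
	"go.temporal.io/sdk/converter"
)

// blobStore stands in for any external store (S3, a catalog table, ...).
// Hypothetical interface for this sketch.
type blobStore interface {
	Put(key string, data []byte) error
	Get(key string) ([]byte, error)
}

// offloadCodec replaces any payload whose data exceeds limit with a
// small payload carrying only a storage key.
type offloadCodec struct {
	store blobStore
	limit int // bytes; choose a value well under Temporal's ~2 MB cap
}

// Compile-time check that offloadCodec satisfies the SDK interface.
var _ converter.PayloadCodec = (*offloadCodec)(nil)

func (c *offloadCodec) Encode(payloads []*commonpb.Payload) ([]*commonpb.Payload, error) {
	out := make([]*commonpb.Payload, len(payloads))
	for i, p := range payloads {
		if len(p.Data) <= c.limit {
			out[i] = p
			continue
		}
		// Hypothetical key scheme; a real codec should use a UUID.
		key := fmt.Sprintf("payload-%d-%d", i, len(p.Data))
		if err := c.store.Put(key, p.Data); err != nil {
			return nil, err
		}
		// Keep the original metadata (it records the real encoding)
		// and add a marker so Decode knows to rehydrate.
		md := make(map[string][]byte, len(p.Metadata)+1)
		for k, v := range p.Metadata {
			md[k] = v
		}
		md["blob-ref"] = []byte("v1")
		out[i] = &commonpb.Payload{Metadata: md, Data: []byte(key)}
	}
	return out, nil
}

func (c *offloadCodec) Decode(payloads []*commonpb.Payload) ([]*commonpb.Payload, error) {
	out := make([]*commonpb.Payload, len(payloads))
	for i, p := range payloads {
		if _, ok := p.Metadata["blob-ref"]; !ok {
			out[i] = p
			continue
		}
		data, err := c.store.Get(string(p.Data))
		if err != nil {
			return nil, err
		}
		// Restore the original metadata, dropping the marker.
		md := make(map[string][]byte, len(p.Metadata))
		for k, v := range p.Metadata {
			if k != "blob-ref" {
				md[k] = v
			}
		}
		out[i] = &commonpb.Payload{Metadata: md, Data: data}
	}
	return out, nil
}
```

The codec would be wired in via `converter.NewCodecDataConverter(converter.GetDefaultDataConverter(), &offloadCodec{...})` on the client options; client and worker must use the same codec.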

Thanks!

victorlcm avatar Sep 04 '24 19:09 victorlcm

@victorlcm -- this one is a tricky issue to solve and is definitely on the roadmap. I was thinking about using a reference for the table mapping and storing the actual blob in the catalog.
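A rough sketch of how that reference-based approach could look (the `flow_table_mappings` table and function names are hypothetical, not PeerDB's actual catalog schema): write the full mapping to the catalog once, hand the workflow only the returned row ID, and have an activity load the mapping back on demand. The `$1` placeholders assume a Postgres driver, matching PeerDB's Postgres catalog.

```go
package catalogref

import (
	"context"
	"database/sql"
	"encoding/json"
)

// TableMapping describes one source-to-destination table pair.
type TableMapping struct {
	SourceTable      string `json:"sourceTable"`
	DestinationTable string `json:"destinationTable"`
}

// saveTableMapping persists the full mapping as JSON in the catalog
// and returns the row ID, so the workflow payload shrinks from
// megabytes of mappings to a single integer.
func saveTableMapping(ctx context.Context, db *sql.DB, flowName string, mapping []TableMapping) (int64, error) {
	blob, err := json.Marshal(mapping)
	if err != nil {
		return 0, err
	}
	var id int64
	err = db.QueryRowContext(ctx,
		`INSERT INTO flow_table_mappings (flow_name, mapping) VALUES ($1, $2) RETURNING id`,
		flowName, blob).Scan(&id)
	return id, err
}

// loadTableMapping is what an activity would call to rehydrate the
// mapping from the ID carried in the workflow input.
func loadTableMapping(ctx context.Context, db *sql.DB, id int64) ([]TableMapping, error) {
	var blob []byte
	if err := db.QueryRowContext(ctx,
		`SELECT mapping FROM flow_table_mappings WHERE id = $1`, id).Scan(&blob); err != nil {
		return nil, err
	}
	var mapping []TableMapping
	err := json.Unmarshal(blob, &mapping)
	return mapping, err
}
```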

As a temporary workaround, would you be able to use multiple mirrors, or is that not an option?
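To make the workaround concrete, here is a trivial sketch of chunking a large table list so each mirror's table mapping stays far below the payload cap; the chunk size of 5000 is an illustrative guess, not a tested recommendation:

```go
package main

import "fmt"

// splitTables partitions a table list into fixed-size chunks, one
// chunk per mirror.
func splitTables(tables []string, chunkSize int) [][]string {
	var chunks [][]string
	for start := 0; start < len(tables); start += chunkSize {
		end := start + chunkSize
		if end > len(tables) {
			end = len(tables)
		}
		chunks = append(chunks, tables[start:end])
	}
	return chunks
}

func main() {
	tables := make([]string, 50000) // roughly the scale in this report
	for i := range tables {
		tables[i] = fmt.Sprintf("public.table_%d", i)
	}
	for i, chunk := range splitTables(tables, 5000) {
		fmt.Printf("mirror_%d: %d tables\n", i, len(chunk))
	}
}
```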

iskakaushik avatar Sep 05 '24 04:09 iskakaushik

Thanks, @iskakaushik, we'll try splitting it into multiple mirrors and check if that works!

victorlcm avatar Sep 05 '24 10:09 victorlcm

fixed by https://github.com/PeerDB-io/peerdb/pull/2090

heavycrystal avatar Dec 03 '24 22:12 heavycrystal