Blob data size exceeds limit.
Hi, we’re experiencing this error when trying to set up a Postgres mirror to S3:
ERROR: unable to submit job: "status: Unknown, message: \"unable to start PeerFlow workflow: Blob data size exceeds limit.\", details: [], metadata: MetadataMap { headers: {\"content-type\": \"application/grpc\"} }"
We’ve investigated and it seems to be Temporal’s blob size limitation on payloads larger than 2 MB. We have a database with approximately 50k tables, so the table mapping for that many tables likely exceeds the limit. Do you know if there’s a way around this? Maybe storing this in an external location and accessing it from the Temporal workflow?
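To illustrate what we mean (all names below are made up, not PeerDB’s actual code): the workflow would only receive a small reference, and an activity would load the full table mapping from wherever it is stored. One caveat is that activity results are Temporal payloads too, so a very large mapping might still need to be read in pieces.

```go
package peerflowsketch

import (
	"context"
	"time"

	"go.temporal.io/sdk/workflow"
)

// TableMappingRef is the only thing passed as workflow input: a small
// identifier instead of the ~50k-entry table mapping itself.
type TableMappingRef struct {
	ID int64
}

// TableMapping is the large structure that blows past the 2 MB payload
// limit when passed directly as workflow input.
type TableMapping struct {
	Entries map[string]string // source table -> destination table
}

// FetchTableMappingActivity loads the mapping from external storage
// (catalog DB, S3, etc.). Activity results are also payloads, so a very
// large mapping may still need chunked reads.
func FetchTableMappingActivity(ctx context.Context, ref TableMappingRef) (*TableMapping, error) {
	// ... look up ref.ID in the external store and deserialize ...
	return &TableMapping{Entries: map[string]string{}}, nil
}

// PeerFlowLikeWorkflow shows the shape of the indirection: the workflow
// is started with just the reference and resolves it lazily.
func PeerFlowLikeWorkflow(ctx workflow.Context, ref TableMappingRef) error {
	ao := workflow.ActivityOptions{StartToCloseTimeout: time.Minute}
	ctx = workflow.WithActivityOptions(ctx, ao)

	var mapping TableMapping
	if err := workflow.ExecuteActivity(ctx, FetchTableMappingActivity, ref).Get(ctx, &mapping); err != nil {
		return err
	}
	// ... continue the mirror setup using mapping ...
	return nil
}
```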
Thanks!
@victorlcm -- this one is a tricky issue to solve and is definitely on the roadmap. I was thinking about using a reference for the table mapping and storing the actual blob in the catalog.
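Roughly, that could look like the sketch below; the table and column names are hypothetical, not the actual catalog schema. The workflow would then carry only the returned id.

```go
// Sketch only: assumes a hypothetical catalog table
//   CREATE TABLE table_mapping_blobs (id BIGSERIAL PRIMARY KEY, mapping JSONB NOT NULL);
// PeerDB's real catalog schema differs.
package catalogsketch

import (
	"context"
	"database/sql"
	"encoding/json"
)

// StoreTableMapping writes the full mapping into the catalog and returns
// the id that gets passed to the workflow instead of the blob itself.
func StoreTableMapping(ctx context.Context, db *sql.DB, entries map[string]string) (int64, error) {
	payload, err := json.Marshal(entries)
	if err != nil {
		return 0, err
	}
	var id int64
	err = db.QueryRowContext(ctx,
		`INSERT INTO table_mapping_blobs (mapping) VALUES ($1) RETURNING id`,
		payload).Scan(&id)
	return id, err
}

// LoadTableMapping is what an activity would call to resolve the reference.
func LoadTableMapping(ctx context.Context, db *sql.DB, id int64) (map[string]string, error) {
	var payload []byte
	if err := db.QueryRowContext(ctx,
		`SELECT mapping FROM table_mapping_blobs WHERE id = $1`, id).Scan(&payload); err != nil {
		return nil, err
	}
	var entries map[string]string
	err := json.Unmarshal(payload, &entries)
	return entries, err
}
```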
As a temporary workaround, would you be able to use multiple mirrors or is that not an option?
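For the splitting, a helper along these lines could partition the table mapping so that each mirror stays comfortably under the payload limit (purely illustrative; mirror creation itself still goes through the normal path):

```go
package mirrorsplit

// chunkTableMapping partitions one huge source->destination mapping into
// smaller maps, each intended for its own mirror, so that no single
// mirror's serialized table mapping approaches the 2 MB payload limit.
// chunkSize is something to tune (e.g. a few thousand tables per mirror).
func chunkTableMapping(all map[string]string, chunkSize int) []map[string]string {
	var chunks []map[string]string
	current := make(map[string]string, chunkSize)
	for src, dst := range all {
		current[src] = dst
		if len(current) == chunkSize {
			chunks = append(chunks, current)
			current = make(map[string]string, chunkSize)
		}
	}
	if len(current) > 0 {
		chunks = append(chunks, current)
	}
	return chunks
}
```

Each chunk would then become the table mapping of its own mirror.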
Thanks, @iskakaushik, we'll try separating it into multiple mirrors and check if that works!
fixed by https://github.com/PeerDB-io/peerdb/pull/2090