Bug: [pglite-tools] pgDump throws an error when fs is opfs-ahp
Summary
When using pgDump in a web worker with fs set to opfs-ahp, the call throws "Error: pg_dump failed with exit code 1."
Detail
Error Message:
Error: pg_dump failed with exit code 1.
Error message: could not find a "pg_dump" to execute
pg_dump: error: query failed: unexpected response from server; first received character was "R"
pg_dump: detail: Query was: SELECT pg_catalog.set_config('search_path', '', false);
at Mt (pg_dump.ts:193:11)
at async init (opfs-worker-pg-dump.js:52:20)
Example
Here is the example repo: https://github.com/lastsunday/pglite/tree/opfs_pg_dump_test
The worker script (opfs-worker-pg-dump.js) is shown below, followed by the HTML page that loads it.
import { PGlite } from "../dist/index.js";
import { worker } from "../dist/worker/index.js";
import { pgDump } from "./pglite-tools/pg_dump.js";

worker({
  async init() {
    const pg = new PGlite("opfs-ahp://my-test-db2");
    // If you want to run any specific setup code for the worker process, you can do it here.
    try {
      console.log("Starting...");
      const start = performance.now();
      console.log("Waiting for ready...");
      await pg.waitReady;
      console.log("Ready! Took", performance.now() - start, "ms");
      console.log("Creating table...");
      await pg.exec(`
        CREATE TABLE IF NOT EXISTS test (
          id SERIAL PRIMARY KEY,
          name TEXT
        );
      `);
      console.log("Inserting data...");
      await pg.exec("INSERT INTO test (name) VALUES ('test');");
      console.log("Selecting data...");
      const res = await pg.exec(`
        SELECT * FROM test;
      `);
      console.log(res);
      // Transaction example:
      console.log("Transaction example...");
      await pg.transaction(async (tx) => {
        await tx.exec("INSERT INTO test (name) VALUES ('test2');");
        await tx.exec("INSERT INTO test (name) VALUES ('test3');");
      });
      console.log("Selecting data...");
      const res2 = await pg.exec(`
        SELECT * FROM test;
      `);
      console.log(res2);
      console.log("Dumping database...");
      const dump = await pgDump({ pg });
      console.log(dump);
    } catch (e) {
      console.error(e);
    }
    return pg;
  },
});

console.log("Worker process started");
<!doctype html>
<html>
  <head>
    <title>PGlite Worker Example</title>
    <link rel="stylesheet" href="./styles.css" />
    <script src="./utils.js"></script>
    <script type="importmap">
      {
        "imports": {
          "@electric-sql/pglite/worker": "../dist/worker/index.js"
        }
      }
    </script>
  </head>
  <body>
    <h1>PGlite OPFS PG Dump Example</h1>
    <div class="script-plus-log">
      <div class="scripts">
        <h2>Worker Thread - <code>opfs-worker-pg-dump.js</code></h2>
        <div class="script" rel="./opfs-worker-pg-dump.js"></div>
        <h2>Main Thread</h2>
        <script type="module">
          import { PGliteWorker } from "@electric-sql/pglite/worker";
          const pg = new PGliteWorker(
            new Worker(new URL("./opfs-worker-pg-dump.js", import.meta.url), {
              type: "module",
            })
          );
        </script>
      </div>
      <div id="log"></div>
    </div>
  </body>
</html>
@lastsunday Thank you for this! We will address this issue as soon as time allows.
While we fix this, here is a workaround: clone your main pglite instance to another instance that doesn't use persistence, and dump that one instead:
it('specify datadir: should dump a database with tables and data', async () => {
  const pg = await PGlite.create({
    dataDir: '/tmp/pg_dump',
  })

  // Create test tables and insert data
  await pg.exec(`
    CREATE TABLE test1 (
      id SERIAL PRIMARY KEY,
      name TEXT
    );
    INSERT INTO test1 (name) VALUES ('test1-row1');
    CREATE TABLE test2 (
      id SERIAL PRIMARY KEY,
      value INTEGER
    );
    INSERT INTO test2 (value) VALUES (42);
  `)

  const pgMem = await pg.clone()
  const dump = await pgDump({ pg: pgMem })
  const content = await dump.text()

  // Check for table creation
  expect(content).toContain('CREATE TABLE public.test1')
  expect(content).toContain('CREATE TABLE public.test2')

  // Check for data inserts
  expect(content).toContain('INSERT INTO public.test1')
  expect(content).toContain("'test1-row1'")
  expect(content).toContain('INSERT INTO public.test2')
  expect(content).toContain('42')
})
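Until the fix lands, here is a rough, untested sketch of how the same clone-then-dump approach could look inside the worker example from the report; it assumes pg.clone() produces a non-persistent copy of the opfs-ahp instance, as in the test above:

// Inside the worker's init(), replacing the direct pgDump({ pg }) call:
console.log("Dumping database via an in-memory clone...");
const pgMem = await pg.clone(); // copy the persistent opfs-ahp instance into a non-persistent one
const dump = await pgDump({ pg: pgMem }); // dump the clone instead of the opfs-ahp-backed instance
console.log(await dump.text());
await pgMem.close(); // release the temporary clone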
This is not a perfect approach, just a temporary workaround for instances that hold only a small amount of data. An out-of-memory error may occur when calling clone on a main PGlite instance whose data is huge.
Fixed by https://github.com/electric-sql/pglite/pull/824