Support for cursors / `pg-cursor`

Open · boredland opened this issue 1 month ago · 5 comments

When using pg-cursor with PGlite:

```ts
import { PGlite } from '@electric-sql/pglite';
import Cursor from 'pg-cursor';

const db = new PGlite();
const cursor = db.query(new Cursor('SELECT * FROM large_table'));
await cursor.read(100);
```

This fails, I think, because PGlite doesn't support the extended query protocol features that cursors need. The errors are `TypeError: chunkReader.read is not a function` and `The "src" argument must be of type string. Received an instance of Cursor`.

Use case

I'm trying to use PGlite for testing, while production uses PostgreSQL with cursor-based streaming (via pg-cursor).

Currently, I have to use a real PostgreSQL instance for tests that involve streaming, which defeats the purpose of using PGlite.

To Reproduce

  • PGlite version: 0.3.11
  • node version: v24.8.1

Expected behavior

Cursors should work similarly to how they work with node-postgres + PostgreSQL, allowing:

  • Server-side cursors via DECLARE CURSOR (see the SQL-level sketch after this list)
  • Portal-based result fetching with row limits
  • Batch processing of large datasets
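
For reference, since PGlite runs real Postgres, SQL-level cursors inside an explicit transaction may already work as a stopgap. A hedged sketch, assuming `DECLARE`/`FETCH`/`CLOSE` are accepted via plain queries (the table name and batch size are illustrative):

```ts
import { PGlite } from '@electric-sql/pglite';

const db = new PGlite();

await db.transaction(async (tx) => {
  // Cursors without HOLD only live inside an explicit transaction.
  await tx.exec(`DECLARE big_cur CURSOR FOR SELECT * FROM large_table`);

  while (true) {
    // Pull the next batch of rows through the cursor.
    const batch = await tx.query(`FETCH 100 FROM big_cur`);
    if (batch.rows.length === 0) break;
    // process batch.rows ...
  }

  await tx.exec(`CLOSE big_cur`);
});
```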

Additional context

Maybe related to #223 (prepared statements) - both require extended query protocol support (Parse, Bind, Execute with row limits).

boredland · Oct 31 '25

We have some ideas around this that don't require an additional package. One of them is to extend the API to provide a callback that would be invoked on each new result received from the database. Would this cover your use case?
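
For illustration, something along these lines (the `onRow` option name and shape are just placeholders, not an existing PGlite API):

```ts
// Hypothetical sketch of a per-row callback option on query().
type RowCallback<T> = (row: T) => void | Promise<void>;

interface StreamingQueryOptions<T> {
  params?: unknown[];
  onRow?: RowCallback<T>; // invoked for each result as it arrives
}

// Possible usage, if such an option existed:
// await db.query('SELECT * FROM large_table', {
//   onRow: (row) => process(row), // rows handled incrementally
// });                             // resolves once the query finishes
```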

tdrz · Nov 04 '25

Would this cover your use case?

That depends on whether it works on both Postgres and PGlite.

boredland · Nov 04 '25

That depends on whether it works on both Postgres and PGlite.

But your example is PGlite-specific:

```ts
import { PGlite } from '@electric-sql/pglite';
import Cursor from 'pg-cursor';

const db = new PGlite();
const cursor = db.query(new Cursor('SELECT * FROM large_table'));
await cursor.read(100);
```

tdrz · Nov 04 '25

Yes, but as I wrote:

I'm trying to use PGlite for testing, while production uses PostgreSQL with cursor-based streaming (via pg-cursor).

So if this ends up with a solution that isn't compatible with pg and pg-cursor, PGlite wouldn't work as a drop-in replacement during tests, no?
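
One way to keep things swappable could be to hide the streaming behind a small interface of our own: pg-cursor in production, plain chunked queries in PGlite tests. A rough sketch (names like `pgCursorStream` and `pgliteChunkStream` are just illustrative, and LIMIT/OFFSET paging is only fine for test-sized data):

```ts
import { Pool } from 'pg';
import Cursor from 'pg-cursor';
import { PGlite } from '@electric-sql/pglite';

type Row = Record<string, unknown>;

// Production: real server-side cursor via pg-cursor.
async function* pgCursorStream(pool: Pool, sql: string, batch = 100): AsyncGenerator<Row> {
  const client = await pool.connect();
  try {
    const cursor = client.query(new Cursor(sql));
    while (true) {
      const rows = await cursor.read(batch);
      if (rows.length === 0) break;
      yield* rows;
    }
    await cursor.close();
  } finally {
    client.release();
  }
}

// Tests: PGlite has no portal-level row limits today, so fall back to
// LIMIT/OFFSET paging over the same query text.
async function* pgliteChunkStream(db: PGlite, sql: string, batch = 100): AsyncGenerator<Row> {
  for (let offset = 0; ; offset += batch) {
    const res = await db.query<Row>(`${sql} LIMIT ${batch} OFFSET ${offset}`);
    if (res.rows.length === 0) break;
    yield* res.rows;
  }
}
```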

boredland · Nov 04 '25

We have some ideas around this that don't require an additional package. One of them is to extend the API to provide a callback that would be invoked on each new result received from the database. Would this cover your use case?

It would be a great feature; I'm waiting for it before trying to switch from WASM SQLite... All of the options would be great:

  • An optional callback parameter in the `query` method's options... BUT when this option is passed, the result should be `Promise<void>` to avoid huge memory usage when returning large result sets (the main purpose of cursors). It could be a little confusing (and the TS types are tricky to write 😅)
  • A `cursor` method with the same parameters as `query`, returning a pg-cursor-like API
  • A `stream`/`rows` method returning an `AsyncGenerator` for great DX: `for await (const line of rows('SELECT ...')) { ... }`. This would IMO be the perfect API if we could also have a `chunk` method (with a `chunkSize` parameter)... but it's always possible to create that ourselves if we have the `stream` method (rough sketch below) 😇
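
For example, assuming a future `db.rows(sql)` streaming API existed (it doesn't today), a chunking helper could be layered on top in user land (all names here are assumptions):

```ts
// Group an async stream of rows into arrays of at most chunkSize items.
async function* chunks<T>(rows: AsyncIterable<T>, chunkSize: number): AsyncGenerator<T[]> {
  let buffer: T[] = [];
  for await (const row of rows) {
    buffer.push(row);
    if (buffer.length >= chunkSize) {
      yield buffer;
      buffer = [];
    }
  }
  if (buffer.length > 0) yield buffer; // flush the final partial chunk
}

// Hypothetical usage:
// for await (const batch of chunks(db.rows('SELECT * FROM large_table'), 100)) {
//   // process batch (an array of up to 100 rows) ...
// }
```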

my-lalex · Nov 07 '25