node-pg-query-stream
readable event no longer works
Version 1.1.0 broke compatibility: it no longer supports the `readable` event.
The general rule/recommendation for supporting `data` vs `readable` is as follows:
- Ideally, both should be supported.
- `data` must be present when the data is provided in chunks (as an array of rows).
- `readable` must be supported when the data is provided one item at a time.

See this question for more details: https://stackoverflow.com/questions/26174308/what-are-the-differences-between-readable-and-data-event-of-process-stdin-stream
This library delivers rows one by one, in simplified form, which means `readable` should be supported first of all, and `data` secondarily. However, this library stopped supporting the `readable` event as of version 1.1.0.
To elaborate a bit further, a complete implementation for this module would:
- Provide the `readable` event for what the module currently does for the `data` event. For this you could simply swap the event name, and you would be done.
- Provide the `data` event with aggregation, i.e. the data sent should be an array of rows, with the maximum length equal to `batchSize`. Worst-case scenario, if you cannot implement such aggregation for whatever reason, you can have `data` duplicate the `readable` event, or better still, supply an array with just one row initially, so you can add proper aggregation later on without breaking compatibility.
@vitaly-t pg-query-stream's reversion to the classic readable stream style is what's breaking https://github.com/dmfay/massive-js/pull/473 too btw.
@dmfay you probably meant the other way round? Because it was previously supporting the `readable` event, and now it only supports the `data` event.
no -- modern streams have `readable`; the `data` event is what makes it classic style.
@dmfay @brianc any resolution on this? Would really appreciate a fix on this. I am willing to help and send a PR if you can point me in the right direction. https://github.com/vitaly-t/pg-promise/issues/502 needs this fix.
You want @brianc, not me :) I wound up writing a wrapper to be able to return a modern stream from Massive.
Bumped into this. Wrapping the stream like @dmfay suggested works.
const stream = require('stream')

// Wraps a classic (data/end) stream in a modern object-mode Readable,
// so consumers can use the 'readable' event and stream.read().
module.exports = function toReadable(classicStream) {
  // no-op read(): rows are pushed as the classic stream emits them
  const readableStream = new stream.Readable({ objectMode: true, read() {} })
  classicStream.on('data', (data) => {
    readableStream.push(data)
  })
  classicStream.on('end', () => {
    readableStream.push(null) // signal EOF to the modern stream
  })
  classicStream.on('error', (err) => {
    readableStream.emit('error', err) // forward errors as well
  })
  return readableStream
}