Question: how to stream results from a read query on a big table?
Question
I have a table with a few million entries in it. I want to process every one of them.
So, I initialized a Thinky model binding for this table (let's call it Item),
and I want a query like this:
const streamOfItems = Item.filter({state: "good"}).toStream();
streamOfItems
  .on('data', function(item) { return doProcess(item); })
  .on('end', function() { console.log('All items processed!'); });
How is this possible?
How can I make something similar to https://www.rethinkdb.com/api/javascript/#each_async with the Thinky ORM?
I'm using this example: http://justonepixel.com/thinky/documentation/api/query/#execute
Item.map(r.row("id")).execute().then(function(cursor) {
cursor.each(function(err, itemId) {
console.log(itemId);
});
});
but I receive this error:
Unhandled rejection TypeError: cursor.each is not a function
    at /home/projects/vodolaz095/thunder/datawarehouse/improveEmails.js:10:10
    at tryCatcher (/home/projects/vodolaz095/thunder/libiva/node_modules/thinky/node_modules/bluebird/js/main/util.js:26:23)
    at Promise._settlePromiseFromHandler (/home/projects/vodolaz095/thunder/libiva/node_modules/thinky/node_modules/bluebird/js/main/promise.js:507:31)
    at Promise._settlePromiseAt (/home/projects/vodolaz095/thunder/libiva/node_modules/thinky/node_modules/bluebird/js/main/promise.js:581:18)
    at Async._drainQueue (/home/projects/vodolaz095/thunder/libiva/node_modules/thinky/node_modules/bluebird/js/main/async.js:128:12)
    at Async._drainQueues (/home/projects/vodolaz095/thunder/libiva/node_modules/thinky/node_modules/bluebird/js/main/async.js:133:10)
    at Immediate.Async.drainQueues (/home/projects/vodolaz095/thunder/libiva/node_modules/thinky/node_modules/bluebird/js/main/async.js:15:14)
    at runCallback (timers.js:672:20)
    at tryOnImmediate (timers.js:645:5)
    at processImmediate [as _immediateCallback] (timers.js:617:5)
Streams don't work with thinky.
I think you want execute({cursor: true}) in your last example.
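For reference, a minimal sketch of that suggestion, reusing the Item model and the doProcess handler from the question (the error handling inside the callback is an assumption, not part of the original snippet):

// Passing {cursor: true} to execute() resolves the query to a driver cursor
// instead of a fully buffered array, so rows can be consumed one at a time.
Item.filter({state: "good"}).execute({cursor: true}).then(function(cursor) {
  cursor.each(function(err, item) {
    if (err) throw err;
    doProcess(item); // per-row handler from the question
  });
});

Since the cursor comes from the underlying RethinkDB driver, the eachAsync method from the linked API page should also be available on it.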
@neumino Thanks, I had the same question, and the solution worked for me :-)
It would be great to mention this aspect in the docs.
If you point me to the repo with the docs, I'll be happy to document all the pitfalls from my learning path and send a PR with improvements to the documentation.
They are in the gh-pages branch.