goodboy
This probably also ties into #206.
@ali1234 I was thinking this is a really interesting idea for my own purposes as well, since I'm working towards building out a stream-based analytics system for financial data....
Just quickly addressing the pitfalls section of your current solution:

> My implementation uses multiprocessing and has some weak points and ugly things in it

Responding in order:

1. in `tractor`...
@ali1234 sweet, yes this looks much easier to understand now. The reactive stuff becomes useful when you want to do forking and merging of streams (obviously not used in your...
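To make the forking/merging point concrete, here's a minimal sketch of stream fan-in using only stdlib `asyncio` (this is not `tractor`'s API, just an illustration of the operation the reactive operators automate): two independent async streams are drained into one queue and yielded as a single merged stream.

```python
# Sketch only: merging ("fan-in") of two async streams with stdlib asyncio.
# All names here (ticks, merge) are made up for illustration.
import asyncio

async def ticks(name, n, delay):
    # a toy stream of n labeled items
    for i in range(n):
        await asyncio.sleep(delay)
        yield f"{name}:{i}"

async def merge(*streams):
    # drain every stream into one queue; yield items as they arrive
    queue = asyncio.Queue()
    DONE = object()

    async def drain(stream):
        async for item in stream:
            await queue.put(item)
        await queue.put(DONE)

    tasks = [asyncio.create_task(drain(s)) for s in streams]
    finished = 0
    while finished < len(tasks):
        item = await queue.get()
        if item is DONE:
            finished += 1
        else:
            yield item
    await asyncio.gather(*tasks)

async def main():
    merged = merge(ticks("a", 3, 0.01), ticks("b", 3, 0.015))
    return [item async for item in merged]

print(asyncio.run(main()))
```

Forking (fan-out) is the mirror image: one producer task putting each item onto several consumer queues.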
Though not for the same type of work (it targets IO-bound tasks), [`asyncio-buffered-pipeline`](https://github.com/michalc/asyncio-buffered-pipeline) has a "composition" API for doing something similar on a single process.
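For a rough idea of what that composition looks like, here's a minimal sketch of the buffered-pipeline pattern (this is not the library's actual API, just the underlying idea): each stage runs as its own task, connected by a small bounded queue, so an upstream stage can process the next item while downstream stages consume earlier ones.

```python
# Sketch only: a buffered single-process pipeline. The names `buffered`,
# `source`, `double`, `add_one` are invented for this example.
import asyncio

_END = object()  # sentinel marking end of stream

def buffered(stage, upstream, maxsize=1):
    # run `stage` over `upstream` in a background task, feeding a
    # bounded queue so stages overlap in time
    queue = asyncio.Queue(maxsize=maxsize)

    async def producer():
        async for item in upstream:
            await queue.put(await stage(item))
        await queue.put(_END)

    async def consume():
        task = asyncio.create_task(producer())
        while True:
            item = await queue.get()
            if item is _END:
                break
            yield item
        await task

    return consume()

async def source():
    for i in range(5):
        yield i

async def double(x):
    await asyncio.sleep(0)  # stand-in for real async IO
    return x * 2

async def add_one(x):
    return x + 1

async def main():
    # stages compose by nesting
    pipeline = buffered(add_one, buffered(double, source()))
    return [x async for x in pipeline]

print(asyncio.run(main()))  # → [1, 3, 5, 7, 9]
```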
@ali1234 sorry to have left this for so long, but we do indeed need something like this for `piker`, and it will likely be implemented using a shared memory system...
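For reference, here's a rough sketch of the shared memory idea using only the stdlib (`piker`'s actual implementation may differ): one side allocates a named `SharedMemory` block and writes into it; the other side attaches by name and reads the same bytes without copying anything through a pipe.

```python
# Sketch only: zero-copy handoff via multiprocessing.shared_memory.
# Both "sides" run in one process here to keep the example self-contained.
from multiprocessing import shared_memory
import struct

# writer side: allocate a named block and pack a float into it
shm = shared_memory.SharedMemory(create=True, size=8)
struct.pack_into("d", shm.buf, 0, 4096.25)

# reader side: attach to the same block by name (no data is copied)
reader = shared_memory.SharedMemory(name=shm.name)
(price,) = struct.unpack_from("d", reader.buf, 0)
print(price)  # → 4096.25

# cleanup: every attachment closes, the creator unlinks
reader.close()
shm.close()
shm.unlink()
```

In a real setup the block name is the only thing that needs to cross the process boundary.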
@ali1234 nice. Link to the new code if you don't mind (I'm sure I can also find it if need be). I will definitely keep you in the loop. The...
A bump for this issue after discussion in chat:

- literal [example code](https://github.com/ali1234/vhs-teletext/blob/master/teletext/mp.py#L291) to get working with `tractor`
- toy example to start with from @ali1234:

```python
import time
...
```
Linking to @richardsheridan's [`map_concurrently_in_subthread_trio.py`](https://gist.github.com/richardsheridan/42d99cbcfcc1d77bc61890b2fae4bfaa) which *may* allow for keeping a sync api up front.
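The same "sync API up front" trick can be sketched with stdlib `asyncio` instead of trio (this is my own illustration, not @richardsheridan's code): run the event loop in a background thread and expose a plain synchronous `map`, so calling code never sees `async` at all.

```python
# Sketch only: a synchronous facade over an event loop running in a
# worker thread. The class name `SyncPortal` is invented for this example.
import asyncio
import threading

class SyncPortal:
    """Run an event loop in a background thread; submit coroutines synchronously."""

    def __init__(self):
        self._loop = asyncio.new_event_loop()
        self._thread = threading.Thread(target=self._loop.run_forever, daemon=True)
        self._thread.start()

    def run(self, coro):
        # block the calling (sync) thread until the coroutine finishes
        return asyncio.run_coroutine_threadsafe(coro, self._loop).result()

    def map(self, async_fn, iterable):
        # a synchronous map over an async function, run concurrently
        async def gather():
            return await asyncio.gather(*(async_fn(x) for x in iterable))
        return self.run(gather())

    def close(self):
        self._loop.call_soon_threadsafe(self._loop.stop)
        self._thread.join()

async def square(x):
    await asyncio.sleep(0.01)  # pretend to do async work
    return x * x

portal = SyncPortal()
print(portal.map(square, range(5)))  # → [0, 1, 4, 9, 16]
portal.close()
```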
@ali1234 ur wish is me command:

```python
from itertools import count, islice
import time
import timeit
from multiprocessing import Pool
from functools import partial

already_called = False

def work(it, a,...
```
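For anyone skimming, here's a self-contained sketch of the ordered-parallel-map pattern the snippet above is driving at (my own reconstruction, not the exact code): `imap` hands items to a worker pool concurrently but yields results in input order, and limiting the input with `islice` up front keeps `imap` from eagerly consuming an unbounded iterator. `ThreadPool` shares `Pool`'s API and keeps the example runnable anywhere; swap in `multiprocessing.Pool` for CPU-bound work.

```python
# Sketch only: ordered parallel map over an iterator. `work` and `pmap`
# are invented names for illustration.
from itertools import count, islice
from functools import partial
from multiprocessing.pool import ThreadPool

def work(x, a):
    # stand-in for an expensive per-item computation
    return x * a

def pmap(fn, it, workers=4):
    # imap preserves input order while items are processed concurrently
    with ThreadPool(workers) as pool:
        yield from pool.imap(fn, it)

# bound the infinite source *before* handing it to the pool
result = list(pmap(partial(work, a=10), islice(count(), 5)))
print(result)  # → [0, 10, 20, 30, 40]
```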