Behrad Zari
I am running multiple dataflows processes, e.g. `dataflows daemon core`, `dataflows daemon redis-events`, `dataflows daemon listeners2`, but I cannot define how many of each initiator to run in...
It seems `every.item` is not locally bound for each iteration of `$every`. I have an every task in which the sub-flow may pause or wait... (consider a long-running sub-task in...
How can I use grep to find lines containing a specific flow id in my dataflo.ws log file? I was unable to use grep since flow ids are colored!
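A minimal sketch for the question above, assuming the colouring is done with standard ANSI escape sequences: strip the escapes with sed first, then grep for the plain id. The log file name and flow id below are placeholders, and the `\x1b` escape assumes GNU sed.

```sh
# Strip ANSI colour escape sequences, then search for the flow id.
# "dataflows.log" and "f1a2b3" are illustrative placeholders only.
sed 's/\x1b\[[0-9;]*m//g' dataflows.log | grep 'f1a2b3'
```

The same pipeline can follow `tail -f` for a live log, though sed/grep output buffering may delay matches slightly.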
How can dataflow be tailored to support simple flow management, e.g. to check whether some flows are leaking or blocked, or to monitor them in a production environment?
How can I access and stop a running workflow?
How can I define and reuse a workflow in etc/project?
Can you comment on some basic task usage? 1) What does the "every" task do? 2) What do "$origin", "$scope", ... mean? 3) Do we have a "startup" initiator which starts...
What about a caching option to keep the `retained` hash key in memory, @mcollina? That way `createRetainedStream` won't touch Redis on each CONNECT; I believe this really improves on...