Transducers.jl
How do I handle streaming input data that does not fit in memory?
I have over 100 million records arriving from streaming data sources, so they need to be processed as a stream with transducers. The input does not fit in memory and arrives as a stream, for example from Kafka. How can I do something like data |> transducer? For example:
- eachline(input_json) |> Map(parse_json) |> op_1 |> collect
- kafka(topic) |> Map(parse_json) |> op_1 |> collect
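One issue with the pipelines above is that collect materializes every result, which defeats the purpose of streaming. Replacing collect with foldl lets Transducers.jl consume the stream one element at a time, so memory use stays constant. A minimal sketch, where parse_record and the IOBuffer input are stand-ins for parse_json and a real file or socket:

```julia
using Transducers

# Stand-in for the question's parse_json: here just parse an Int.
parse_record(line) = parse(Int, line)

# IOBuffer stands in for a real file or socket. eachline yields
# lines lazily, and foldl consumes the eduction one element at a
# time, so the whole stream never needs to fit in memory.
io = IOBuffer("1\n2\n3\n")
total = foldl(+, eachline(io) |> Map(parse_record))
println(total)  # 6
```

As far as I know, the thread-parallel fold foldxt needs an input that can be split in half (via SplittablesBase), which a sequential line stream cannot provide; the usual workaround is to read the stream in batches and parallelize within each batch.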
Both have problems:
- Transducers.jl does not support parallelism over eachline
- Transducers.jl has no built-in support for Kafka stream input
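On the second point: Transducers.jl can fold over any iterable, including a Channel, so one pattern is to have a producer task push messages into a Channel and fold over that. In practice the producer would be a Kafka poll loop using a client library; the sketch below uses a fake producer generating JSON-like strings as a stand-in, and parse_value is a hypothetical replacement for parse_json:

```julia
using Transducers

# Fake producer standing in for a Kafka poll loop: a task that
# pushes messages into a bounded Channel and closes it when done.
messages = Channel{String}(100) do ch
    for i in 1:5
        put!(ch, "{\"value\": $i}")   # fake JSON payloads
    end
end

# Stand-in for parse_json: pull the integer out of the payload.
parse_value(msg) = parse(Int, match(r"\d+", msg).match)

# foldl drains the Channel one message at a time; backpressure
# comes from the Channel's bounded buffer.
total = foldl(+, messages |> Map(parse_value))
println(total)  # 15
```

The bounded Channel also gives you backpressure for free: the producer blocks when the consumer falls behind, so the buffer never grows past its capacity.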