Support for CF Workers & Deno (Browser support)
This is related to issue #66
Since the native streaming APIs still can't do this, it would be super-useful!
Both CF Workers and Deno are limited to 128 MB per request, so being able to stream-parse JSON would be very beneficial: it could lower CPU time and allow parsing much larger JSON payloads.
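To make the memory point concrete, here is a minimal sketch using the Web Streams API available in Workers and Deno. The stream below is simulated; in a Worker it would be a real `response.body`. Each chunk is processed and then discarded, so peak memory stays at roughly one chunk regardless of total payload size:

```javascript
// Consume a body stream chunk-by-chunk instead of buffering it all.
async function countBytes(stream) {
  const reader = stream.getReader();
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    total += value.byteLength; // process the chunk, then let it go
  }
  return total;
}

// Simulated "response body"; in a Worker this would be response.body.
const body = new ReadableStream({
  start(controller) {
    const enc = new TextEncoder();
    controller.enqueue(enc.encode('{"items":['));
    controller.enqueue(enc.encode('1,2,3'));
    controller.enqueue(enc.encode(']}'));
    controller.close();
  },
});

countBytes(body).then((n) => console.log(n)); // → 17 bytes total
```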
Could you give me more information on use cases?
As far as I can tell stream-json is used mostly as part of utilities or desktop applications to process huge files, which involves reading and/or producing them. I can see Deno as an alternative platform to run such utilities. Running it in a browser involves sending huge files over the network. I am not sure it is practical.
The other problem is that the support for streams in browsers is spotty: https://caniuse.com/streams
Last time I checked Deno (one year ago or so) it didn't have functional streams in the standard library. It looks like they have something now: https://deno.land/[email protected]/io but it is hard to assess if it is suitable for our needs.
The important part here is not the presence of streams, but the ability to create custom streams.
I can see switching stream-json to ES6 modules, using import rather than require(). While that would technically make it interoperable with Deno and modern browsers, we would still need to account for differences between Node's Stream API and the stream APIs on other platforms.
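To illustrate the interop gap, here is a sketch of a custom stream in the WHATWG Streams model used by Workers, Deno and modern browsers. It does the same job as a Node `stream.Transform`, but through a different API surface, which is exactly the difference that would have to be accounted for:

```javascript
// A custom transform in the WHATWG Streams model: the same idea as
// Node's stream.Transform, but a different API surface.
const upper = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  },
});

const source = new ReadableStream({
  start(c) {
    c.enqueue('hello ');
    c.enqueue('streams');
    c.close();
  },
});

async function collect(stream) {
  let out = '';
  for await (const chunk of stream) out += chunk; // ReadableStream is async-iterable on Node/Deno
  return out;
}

collect(source.pipeThrough(upper)).then(console.log); // → "HELLO STREAMS"
```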
What do you think?
About 2 years ago I built an AWS Lambda in Node.js that processed a large CSV, about 35 MB in size. It took quite long, because loading the file into a variable means the entire content sits in memory before any processing, even if it's just a string. I read about streaming with createReadStream, and when I switched to that, parsing time dropped to less than half. I'm assuming the same applies here to CPU load: processing from a large variable is much heavier than processing while streaming, and in this case we don't even need to hold the payload in RAM but can discard each chunk once it has been processed.
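A toy sketch of that chunk-at-a-time approach (the line counter here is illustrative, not the actual Lambda code): only the current partial line is kept in memory, and record boundaries falling mid-chunk are handled by carrying the tail over to the next chunk.

```javascript
// Process records chunk-by-chunk, keeping only the current partial line.
function makeLineCounter() {
  let partial = '';
  let lines = 0;
  return {
    push(chunk) {
      const parts = (partial + chunk).split('\n');
      partial = parts.pop(); // last piece may be an incomplete line
      lines += parts.length;
    },
    finish() {
      if (partial.length > 0) lines += 1; // trailing line without newline
      return lines;
    },
  };
}

// Chunks arrive with record boundaries in arbitrary places:
const counter = makeLineCounter();
for (const chunk of ['id,name\n1,al', 'ice\n2,bob\n3,', 'carol']) {
  counter.push(chunk);
}
console.log(counter.finish()); // → 4 (header + 3 rows)
```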
Both Cloudflare Workers and Deno charge per request, but also for the CPU time consumed. You can imagine that being able to cut CPU time in half by stream-processing would be valuable in this case.
Another use case for incremental/stream parsing of JSON in browsers is not large payloads, but payloads delivered over time, where a UI could update as chunks are delivered. Consider a server aggregating data from remote dependencies for a client in a single HTTP call -- incremental delivery of the payload allows the server to avoid the equivalent of Promise.all([every, remote, dep]).
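A hedged sketch of that idea (all names illustrative): a server-side ReadableStream that forwards each dependency's result as soon as its promise settles, instead of awaiting them all before replying. In a real server this stream would be the HTTP response body.

```javascript
// Stream each remote dependency's result as soon as it resolves,
// instead of awaiting Promise.all([every, remote, dep]) first.
function streamAsResolved(promises) {
  return new ReadableStream({
    start(controller) {
      let pending = promises.length;
      if (pending === 0) return controller.close(); // nothing to wait for
      for (const p of promises) {
        p.then((value) => {
          controller.enqueue(JSON.stringify(value) + '\n'); // one JSON doc per line
          if (--pending === 0) controller.close();
        });
      }
    },
  });
}

// Simulated remote deps with different latencies:
const delay = (ms, v) => new Promise((res) => setTimeout(() => res(v), ms));
const body = streamAsResolved([
  delay(30, { dep: 'slow' }),
  delay(10, { dep: 'fast' }),
]);

(async () => {
  for await (const line of body) process.stdout.write(line); // fast arrives first
})();
```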
I am waiting for better browser support and for more mature support of streams by Deno.
FWIW, I have a use case similar to @justinbayctxs (incremental UI updates as JSON arrives). Now that streams are supported in most modern browsers (caniuse streams), I am planning to implement incremental UI updates somehow to improve user experience. My other strategy, if JSON streaming won't work, is something like multipart/mixed (each JSON object sent in its own multipart chunk). JSON streaming would be cleaner though, and would keep the API backwards compatible with clients that aren't aware of streaming.
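For illustration, a minimal client-side sketch of the streaming approach, assuming newline-delimited JSON framing (that framing is my assumption for the example; stream-json supports richer formats). Each complete JSON document triggers a callback as it arrives, so the UI can update per item even when chunk boundaries fall mid-document:

```javascript
// Parse a newline-delimited JSON stream incrementally: invoke onObject
// for each complete JSON document as soon as it arrives.
async function consumeNdjson(stream, onObject) {
  const decoder = new TextDecoder();
  let buffer = '';
  for await (const chunk of stream) {
    buffer += decoder.decode(chunk, { stream: true });
    let idx;
    while ((idx = buffer.indexOf('\n')) >= 0) {
      const line = buffer.slice(0, idx);
      buffer = buffer.slice(idx + 1);
      if (line.trim()) onObject(JSON.parse(line));
    }
  }
  if (buffer.trim()) onObject(JSON.parse(buffer)); // final doc without newline
}

// Simulated response body, split mid-document on purpose:
const enc = new TextEncoder();
const body = new ReadableStream({
  start(c) {
    c.enqueue(enc.encode('{"id":1}\n{"i'));
    c.enqueue(enc.encode('d":2}\n{"id":3}'));
    c.close();
  },
});

consumeNdjson(body, (obj) => console.log(obj.id)); // → 1, 2, 3
```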
I think client parsing for UX optimization is an extremely common use case. Would love to see the support added.
For anyone interested in streaming JSON in a browser / deno please also consider https://github.com/juanjoDiaz/streamparser-json
There's also https://github.com/xtao-org/jsonhilo now, so I'll close this ticket.