Patrick Weygand
The limit is on top-level URL objects, not cumulative links. However, it would be nice to have a file size enforcer that splits the files automatically when it exceeds...
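A minimal sketch of the splitting idea mentioned above. The 50,000-URL and 50MB caps come from the sitemap protocol; the byte size is only approximated here by summing serialized entry lengths, and the function name and shape are illustrative, not part of the sitemap library's API.

```javascript
// Hypothetical helper: partition URLs into sitemap-sized chunks.
const URL_LIMIT = 50000;            // per-file URL cap from the sitemap protocol
const BYTE_LIMIT = 50 * 1024 * 1024; // per-file byte cap from the sitemap protocol

function chunkUrls(urls, urlLimit = URL_LIMIT, byteLimit = BYTE_LIMIT) {
  const chunks = [[]];
  let bytes = 0;
  for (const url of urls) {
    // Rough per-entry size; a real enforcer would measure the actual XML.
    const size = Buffer.byteLength(`<url><loc>${url}</loc></url>`);
    const current = chunks[chunks.length - 1];
    if (current.length >= urlLimit ||
        (current.length > 0 && bytes + size > byteLimit)) {
      chunks.push([]); // start a new sitemap file
      bytes = 0;
    }
    chunks[chunks.length - 1].push(url);
    bytes += size;
  }
  return chunks;
}
```

With `urlLimit` set to 2, three URLs land in two chunks; the real enforcer would write each chunk to its own sitemap file and index them.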
@huntharo that would be awesome
Thanks for your PR. I've merged and released it. It looks like the validator's console.warn still needs to be handled. I'll take a look at that tomorrow.
@huntharo I think you need to call the callback with the error and handle it there. It's also very possible this exact scenario is built into streams and we just haven't seen...
Some notes/thoughts. Note 1: the transform is in object mode if I recall, which will impact some of the docs you read. Thought 2: I wonder if it might be...
I saw this first so I'm replying to some of your thoughts in the bug thread: I think trying to anticipate hitting the limit is a flawed approach as at...
@huntharo ok. I'm going to give a shot at this myself if for no other reason than to better understand the problem space. I'll set a deadline for Saturday.
I got a chance to dig into this tonight and in trying my own approach ended up at something very similar to yours. So I'll just tag my modifications on...
Good catches @huntharo! And RE your chained calls, I would think it'd still call them serially and add up the total correctly even so, but might as well write a...