
stdin segment fails on additional/no line breaks in json

Open Mynacol opened this issue 1 year ago • 1 comment

I modified a JSON export with another tool (jq) and was unable to read it back into flowpipeline. In the first attempt, jq pretty-printed the JSON with newlines between individual JSON fields (so a single line no longer held a full flowpipeline object), which led to a failure decoding the JSON. In the second attempt I instructed jq to "uglify" the JSON, which removes all newlines. But then the import failed because the file contained one very long line, leading to a "buffer too big" error.

Both errors are caused by the input segment using a scanner and always processing the data line by line. This, however, only works for a subset of valid JSON files.
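A minimal sketch of an alternative (the `flowRecord` type and its fields are hypothetical stand-ins, not flowpipeline's actual message type): `encoding/json`'s streaming `Decoder` tokenizes the input itself, so it neither cares about line breaks nor needs a whole line in memory at once:

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"os"
)

// flowRecord is a hypothetical stand-in for flowpipeline's flow type.
type flowRecord struct {
	SrcAddr string `json:"src_addr"`
	DstAddr string `json:"dst_addr"`
	Bytes   uint64 `json:"bytes"`
}

func main() {
	// json.Decoder reads a stream of concatenated JSON values; it works
	// identically on pretty-printed and single-line ("uglified") input.
	dec := json.NewDecoder(os.Stdin)
	for {
		var rec flowRecord
		if err := dec.Decode(&rec); err == io.EOF {
			break // clean end of input
		} else if err != nil {
			fmt.Fprintln(os.Stderr, "decode error:", err)
			os.Exit(1)
		}
		// hand rec to the next pipeline segment here
		fmt.Printf("%+v\n", rec)
	}
}
```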

Mynacol avatar Nov 19 '24 09:11 Mynacol

I think this design was chosen to prevent the "buffer too big" error. By reading the JSON line by line, only part of it has to be loaded at a time, which allows the use of bigger files. I'm not sure how this behavior should be changed.
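For what it's worth, a hedged sketch of the trade-off (the buffer sizes here are arbitrary): `bufio.Scanner` caps lines at `bufio.MaxScanTokenSize` (64 KiB) by default, and `Scanner.Buffer` can raise that cap, but any fixed cap still fails on a long-enough single line:

```go
package main

import (
	"bufio"
	"os"
)

func main() {
	// Stopgap: enlarge the scanner's buffer beyond the 64 KiB default.
	// Any fixed cap can still be exceeded by a single-line file, so this
	// only moves the failure point rather than removing it.
	scanner := bufio.NewScanner(os.Stdin)
	scanner.Buffer(make([]byte, 0, 1024*1024), 256*1024*1024) // 256 MiB cap, arbitrary
	for scanner.Scan() {
		line := scanner.Bytes()
		_ = line // decode one JSON object per line, as today
	}
}
```

A streaming `json.Decoder` (as in the sketch above) instead buffers roughly one value at a time, so memory stays bounded per record regardless of line length, which would keep the big-file property while fixing both failure modes.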

ynHuber avatar Apr 14 '25 12:04 ynHuber