eliasdb
Out of memory when loading a large JSON file.
I have a 941MB JSON file I would like to import into EliasDB, but I get an out-of-memory crash. This is because the importer tries to load the data wholesale instead of incrementally. A way to do incremental loading from a single JSON file would be great.
You are correct. Let me think about that ...
I think you could switch to a streaming parser for JSON, maybe something like this, if you don't mind the dependency:
https://github.com/francoispqt/gojay
Yeah I tend to agree about the JSON parser
Consider this one: github.com/buger/jsonparser
https://golangrepo.com/repo/buger-jsonparser-go-json
There are others that achieve high throughput and also support code generation.
They also compile to WASM, which may be useful overall.
I think what would also work here is newline-delimited JSON data, like BigQuery uses ...