
Batch larger serialisations

Open elmarti opened this issue 2 years ago • 2 comments

Current behavior

Node has an internal string length limit, meaning that any data larger than that limit (roughly 0.5–1 GiB depending on the V8 version; the payloads here are around 2GB) will fail with a string length error when we run JSON.parse here: https://github.com/LokiJS-Forge/LokiDB/blob/master/packages/loki/src/loki.ts#L687

Expected behavior

Data of any size is eventually serialized without error.

What is the motivation / use case for changing the behavior?

Allow the batch insert of large quantities of data greater than 2GB

Environment


- LokiDB version: 2.1.0
- Node version: v14.17.0

Others:
- We could add an explicit batch length like knex does in their `batchInsert` method https://knexjs.org/
- We could internally batch inserts with an arbitrary default that can be overridden in the DB config, e.g. batches of 10,000

elmarti avatar Sep 22 '21 12:09 elmarti

Am I hitting this issue here? https://github.com/nuxt/content/issues/947 It would be great if this got fixed!

lustremedia avatar Jan 05 '22 13:01 lustremedia

@lustremedia I attempted a fix, but the serialization approach is quite fundamental to how Loki works, so the change would be pretty significant. Also note that this project is somewhat unmaintained: https://github.com/LokiJS-Forge/LokiDB/issues/190#issuecomment-901017586. I can recommend the successor that I wrote, which doesn't have this issue: https://github.com/elmarti/camadb. The main caveat is the current lack of indexing (planned for the near future), although it performs pretty well without it.

elmarti avatar Jan 09 '22 13:01 elmarti