Text parsing buffer length check
I'm getting "Max buffer length exceeded" errors while parsing large text nodes. Looking at the code, this line seems to indicate that the parser should be able to handle strings larger than MAX_BUFFER_LENGTH. But a closeText() function doesn't exist, and in practice that branch is never reached anyway, because the buffer is named "textNode" while the check looks for "text" (code).
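To make the mismatch concrete, here is a paraphrased, self-contained sketch of the check described above (not clarinet's actual source; names and structure are approximated from the report): the buffers list contains `"textNode"`, but the switch that is supposed to flush text compares against `"text"`, so the flush branch is unreachable and the error branch always fires.

```javascript
// Paraphrased sketch of the reported bug, not clarinet's real code.
var MAX_BUFFER_LENGTH = 64 * 1024;
var buffers = ["textNode", "numberNode"];

function checkBufferLength(parser) {
  for (var i = 0; i < buffers.length; i++) {
    var name = buffers[i];
    if (parser[name].length > MAX_BUFFER_LENGTH) {
      switch (name) {
        case "text": // never matches: the buffer is named "textNode"
          // closeText(parser) would flush the buffer here, but this
          // branch is dead code because of the name mismatch
          break;
        default:
          parser.error = "Max buffer length exceeded: " + name;
      }
    }
  }
  return parser;
}

var parser = { textNode: "x".repeat(MAX_BUFFER_LENGTH + 1), numberNode: "" };
checkBufferLength(parser);
console.log(parser.error); // "Max buffer length exceeded: textNode"
```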
I can work around the issue by increasing MAX_BUFFER_LENGTH, but being able to deal with arbitrary string sizes would be nicer, of course.
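For reference, this is roughly what the workaround looks like, assuming clarinet exposes MAX_BUFFER_LENGTH as a module-level property (as sax-js, which clarinet is modeled on, does); the limit must be raised before the document is parsed:

```javascript
// Assumption: clarinet exports MAX_BUFFER_LENGTH on the module object.
var clarinet = require("clarinet");
clarinet.MAX_BUFFER_LENGTH = 10 * 1024 * 1024; // e.g. 10 MB per value

var parser = clarinet.parser();
parser.onvalue = function (v) {
  // v can now be up to 10 MB before the length check trips
};
```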
I have this same (similar) question. I even changed the check to "textNode", but the closeText function doesn't even exist...
I have two users of dexie-export-import who run into this issue too. Could you give some guidance on how to resolve it? Is the only solution to increase MAX_BUFFER_LENGTH beyond 64k, or could the values be handled in a more streaming fashion? I suppose we might need to support hundreds of megabytes in our case, as people might store large blobs this way.
@dfahlander If the blobs are stored in a string, then I think that with the current API your only option is to increase MAX_BUFFER_LENGTH. The reason is that the API has an 'onvalue' callback that receives all of the data at once, as a single string argument. If that is not an option, the API would need to change to allow streaming values.
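A streaming-value API could look something like the sketch below. This is purely hypothetical (clarinet has no such callback today; `onvaluechunk` and `onvalueend` are invented names): instead of buffering the whole string for a single onvalue call, the parser would flush the text buffer whenever it crosses the limit and signal the end of the value separately, leaving it to the consumer to reassemble or stream the chunks onward.

```javascript
// Hypothetical streaming-value delivery; not part of clarinet's API.
var MAX_BUFFER_LENGTH = 8; // tiny limit, for illustration only

function emitStringValue(parser, data) {
  var buffer = "";
  for (var i = 0; i < data.length; i++) {
    buffer += data[i];
    if (buffer.length >= MAX_BUFFER_LENGTH) {
      parser.onvaluechunk(buffer); // partial data, may fire many times
      buffer = "";
    }
  }
  if (buffer.length > 0) parser.onvaluechunk(buffer);
  parser.onvalueend(); // value complete; consumer decides what to do
}

var chunks = [];
var parser = {
  onvaluechunk: function (c) { chunks.push(c); },
  onvalueend: function () { /* e.g. close an output stream here */ }
};
var bigValue = "0123456789abcdefghij"; // 20 chars, limit is 8
emitStringValue(parser, bigValue);
console.log(chunks); // [ '01234567', '89abcdef', 'ghij' ]
```

The key point is that no single chunk ever exceeds MAX_BUFFER_LENGTH, so memory use per value stays bounded regardless of the value's total size.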
Thanks! I did increase it to 10 MB for now. I see the problem: in our code we also assume that each single row always fits into memory, so we wouldn't benefit from streaming within a row internally. I might increase the limit even more if needed. We're using a fork of clarinet right now; I could open a PR if there's interest in incorporating the raised limit into the main package.