[file_packager] split data files when file size exceeds ArrayBuffer limit
Addresses #24691
Currently, ArrayBuffers cannot be larger than 2046 MB on Chrome, so when bundling a large set of files with file_packager.py, the resulting .data file will fail to load. Breaking the output up into multiple data files bypasses this issue.
```
Error: Unexpected error while handling : http://localhost:3040/Gtest.data
RangeError: Array buffer allocation failed
```
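The core of the fix can be sketched as a simple chunking step: split the packaged payload into pieces no larger than the allocation limit, and emit each piece as its own .data file. This is an illustrative sketch only; the function name `split_data` is hypothetical and not the actual file_packager.py implementation.

```python
# Hypothetical helper: split a packaged payload into chunks so that
# each resulting .data file stays under the browser's maximum
# ArrayBuffer allocation size.

def split_data(payload: bytes, max_chunk_size: int) -> list[bytes]:
    """Split payload into chunks of at most max_chunk_size bytes."""
    if max_chunk_size <= 0:
        raise ValueError('max_chunk_size must be positive')
    return [payload[i:i + max_chunk_size]
            for i in range(0, len(payload), max_chunk_size)]
```

For example, a 5 GiB payload with a 2 GiB cap would produce three .data files of 2 GiB, 2 GiB, and 1 GiB, each of which can be loaded into its own ArrayBuffer.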
@sbc100 Is this aligned with what you were thinking?
Can we add some tests for this? What are the specific limits you are running into? Are those limits not fixed? i.e. does it need to be configurable?
I wanted to make sure I was on the right track before adding tests. The limits are described in the conversation on the issue I mentioned. We could hard-code the limit to 2 GiB as a likely guess at the true limit; if that is what you prefer, it is simpler from a testing perspective. It seemed more flexible to make it configurable for anyone calling the utility, but I see the simplicity argument too.
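The configurable-with-a-default approach described above could look something like the following. The flag name `--max-data-file-size` and the 2 GiB default are assumptions for illustration, not an existing file_packager.py option.

```python
# Sketch: expose the split threshold as a command-line option with a
# hard-coded 2 GiB default, so callers can override it if their target
# environment allows larger ArrayBuffers.
import argparse

DEFAULT_MAX_DATA_SIZE = 2 * 1024 * 1024 * 1024  # 2 GiB

parser = argparse.ArgumentParser()
parser.add_argument('--max-data-file-size', type=int,
                    default=DEFAULT_MAX_DATA_SIZE,
                    help='split output into multiple .data files, '
                         'each at most this many bytes')

args = parser.parse_args([])  # no flag given: falls back to the default
```

Callers targeting, say, Firefox could pass a larger value, while the default stays safe for Chrome.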
This does seem like a reasonable approach.
I'm still a little fuzzy on exactly why and when this might be needed in the real world, I think I need to go re-read our original discussion, but I also think including more information in the PR (i.e. in the description, or in comments) would be good.
I updated the description. Based on that, if you want me to hard-code the limit into the packager, I can take that approach.
@sbc100 I'm going to look at adding tests for this. Do you have any recommendations? Would you like me to generate temporary file(s) so that I can reach the 2 GiB limit?
@sbc100 This should be ready for review. I'm not sure about the test failures, whether they are flaky or related to my change; I don't see them fail locally.
Peculiarly, I find that in Chrome the max ArrayBuffer size is not 2 GB (2048 MB) but 2046 MB:
Firefox supports up to 16GB ArrayBuffers:
In node.js I get up to 4GB ArrayBuffers:
What is your target environment?
@juj Thanks! My target is Chrome. I incorrectly associated [Number.MAX_SAFE_INTEGER](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer/ArrayBuffer#exceptions) in the ArrayBuffer documentation with MAX_INT. I didn't test the exact limit myself; I was just trying to follow the documentation but clearly misinterpreted it.
Looks like Chrome has the lowest limit, so I will change the code to use that limit. Safari appears to be 4 GiB.
@juj @sbc100 Anything else on this?