rn-fetch-blob
read chunks of large files, 250 MB+
Is there a way to read chunks of an asset using a start/end offset? This is must-have functionality when handling large files.
Not that I am aware of. I'm curious what your use case is.
Reading large files into memory in JS is something I usually try to avoid, and I simply use rn-fetch-blob for downloading, uploading, and finding the paths to files. Then I delegate the actual file handling to other components.
For example, with a video I would never try to read the file into memory; I would pass the path off to a component from react-native-video, which handles it in natively implemented code. For a PDF I would do the same, but with a PDF-viewing component whose native code can handle the large file.
Some upload APIs let us specify how many bytes we are uploading. E.g., if a file is 10 MB, we might perform 10 requests, each carrying 1 MB of data. That allows us to build an uploader that can resume from its last checkpoint, say if the user's Wi-Fi fails or the user kills the app. When they open it back up, it can just continue uploading.
By reading files with a start/end offset, we can make sure we only bring 1 MB of data into memory at a time.
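Roughly what I have in mind, as a sketch: readChunk(path, offset, length) is the hypothetical API this issue is asking for, and the X-Chunk-Offset header is just an assumed contract with the server.

```ts
import RNFetchBlob from 'rn-fetch-blob';

const CHUNK_SIZE = 1024 * 1024; // 1 MB per request

// Hypothetical: read `length` bytes of `path` starting at `offset`,
// returned base64-encoded. This is the API being requested here.
declare function readChunk(path: string, offset: number, length: number): Promise<string>;

async function resumeUpload(path: string, uploadUrl: string, resumeFrom: number) {
  const stat = await RNFetchBlob.fs.stat(path);
  const total = Number(stat.size);

  for (let offset = resumeFrom; offset < total; offset += CHUNK_SIZE) {
    const length = Math.min(CHUNK_SIZE, total - offset);
    const chunk = await readChunk(path, offset, length); // only ~1 MB in JS memory

    // rn-fetch-blob sends a base64 string body as raw binary when the
    // Content-Type is application/octet-stream.
    await RNFetchBlob.fetch(
      'POST',
      uploadUrl,
      {
        'Content-Type': 'application/octet-stream',
        'X-Chunk-Offset': String(offset), // assumed server contract
      },
      chunk,
    );
    // Persist offset + length as a checkpoint so the upload can
    // resume here after a Wi-Fi drop or app kill.
  }
}
```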
I'm also looking for similar functionality.
@rexjrs Btw, if you still need this you can use RNFetchBlob.fs.slice(path_to_source, path_to_write, firstByte, lastByte);
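For anyone finding this later, a sketch of how fs.slice can back the chunked-upload use case above: copy a byte range into a temp file, upload that file with wrap() so it streams from disk, then delete it. The endpoint and paths are placeholders.

```ts
import RNFetchBlob from 'rn-fetch-blob';

// Upload bytes [start, end) of `src` without loading them into JS memory.
async function uploadSlice(src: string, uploadUrl: string, start: number, end: number) {
  const chunkPath = `${RNFetchBlob.fs.dirs.CacheDir}/chunk-${start}`;
  await RNFetchBlob.fs.slice(src, chunkPath, start, end);

  await RNFetchBlob.fetch(
    'POST',
    uploadUrl,
    { 'Content-Type': 'application/octet-stream' },
    RNFetchBlob.wrap(chunkPath), // body streams from the file on disk
  );

  await RNFetchBlob.fs.unlink(chunkPath); // clean up the temp slice
}
```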
I have tried loading large data from the server (~70 MB). If I load it directly into the UI, it gives an out-of-memory error. So I cached it to the file system and read it from there, but that runs out of memory too. I am looking for something that would page it internally, since it is a JSON array.
On further investigation, I realized that it is not a good idea to keep such large files in JS memory; it causes out-of-memory errors and the app will crash. A good fix can be to use a database (RealmDB or WatermelonDB) that processes large data natively and hands us only as much data as we need to work with.
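If pulling in a database is too heavy, fs.readStream at least lets you page the file through JS in fixed-size buffers instead of one giant string. A sketch (the buffer size is arbitrary); note that chunks won't align with JSON token boundaries, so a JSON array would still need an incremental parser on top.

```ts
import RNFetchBlob from 'rn-fetch-blob';

// Read a large cached file in small buffers so only one chunk
// is resident in JS memory at a time.
async function processInChunks(path: string, onChunk: (chunk: string) => void) {
  const stream = await RNFetchBlob.fs.readStream(path, 'utf8', 4096);
  await new Promise<void>((resolve, reject) => {
    stream.open();
    stream.onData((chunk) => onChunk(chunk as string));
    stream.onError(reject);
    stream.onEnd(resolve);
  });
}
```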
Can somebody show the right implementation of this task in React Native code? How do you download files larger than 50 MB without out-of-memory errors?
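Downloading itself shouldn't need chunked reads: if you give rn-fetch-blob a target path, it streams the response body straight to disk and JS only ever sees the file path. A sketch, with the URL and filename as placeholders:

```ts
import RNFetchBlob from 'rn-fetch-blob';

// Stream a large download directly to a file; the body never enters
// the JS heap, so the file size is not limited by available memory.
async function downloadLargeFile(url: string): Promise<string> {
  const res = await RNFetchBlob.config({
    fileCache: true,
    path: `${RNFetchBlob.fs.dirs.DocumentDir}/large-download.bin`,
  }).fetch('GET', url);
  return res.path(); // hand this path to a native component; don't readFile() it
}
```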
.slice() is promising! Is there any chance we could get an equivalent for writing chunks to a file at a given offset?
This would be useful so that I could download data from a P2P network and store the chunks locally as they're downloaded. https://datproject.org/
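As far as I can tell there's no random-access write today, but if the chunks happen to arrive in order, an appending writeStream gets close. A sketch, assuming base64-encoded chunks; out-of-order P2P pieces would still need the offset-write API requested here.

```ts
import RNFetchBlob from 'rn-fetch-blob';

// Append chunks to `dest` as they arrive, in order. Writing at an
// arbitrary offset would still need the API requested in this issue.
async function writeChunksSequentially(dest: string, chunks: AsyncIterable<string>) {
  const stream = await RNFetchBlob.fs.writeStream(dest, 'base64', true); // append mode
  for await (const chunk of chunks) {
    await stream.write(chunk);
  }
  await stream.close();
}
```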
I suppose there is no news on this?
One use case would be to decrypt downloaded files on the fly in blocksize chunks. What do you think?
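That would combine a read stream with an appending write stream: read ciphertext in block-sized buffers, decrypt each, and append the plaintext. A sketch; decryptBlock is entirely hypothetical and stands in for whatever crypto library you use.

```ts
import RNFetchBlob from 'rn-fetch-blob';

// Hypothetical block-cipher step; stands in for a real crypto library.
declare function decryptBlock(ciphertextBase64: string): string;

// Decrypt `src` into `dest` one buffer at a time; memory use stays flat.
async function decryptFile(src: string, dest: string) {
  // Base64 read streams want a buffer size that is a multiple of 3.
  const input = await RNFetchBlob.fs.readStream(src, 'base64', 4095 * 3);
  const output = await RNFetchBlob.fs.writeStream(dest, 'base64', true);
  await new Promise<void>((resolve, reject) => {
    input.open();
    input.onData((block) => output.write(decryptBlock(block as string)));
    input.onError(reject);
    input.onEnd(resolve);
  });
  await output.close();
}
```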
This is an important feature. Any plans to add it?