
Read chunks of large files (250 MB+)

Open rexjrs opened this issue 6 years ago • 11 comments

Is there a way to read chunks of an asset using a start/end offset? This functionality is a must when handling large files.

rexjrs avatar Jun 19 '18 15:06 rexjrs

Not that I am aware of. I am curious what your use case is?

Reading large files into memory in JS is something I usually try to avoid, and I simply use rn-fetch-blob for downloading, uploading, and finding the paths to files. Then I delegate the actual file handling to other components.

For example, with a video I would never try to read the file into memory; I would pass the path to a component from react-native-video that handles it in natively implemented code, or do the same for a PDF with a viewer component whose native code can handle the large file.

Traviskn avatar Jul 11 '18 21:07 Traviskn

Some uploading APIs let you specify how many bytes you are uploading. E.g., if a file is 10 MB, we might perform 10 requests, each carrying 1 MB of data. That allows us to build an uploader that can resume from its last checkpoint, say if the user's wifi fails or the user kills the app; when they open it back up, the upload just continues.

By reading files with a start/end offset, we can make sure we only bring 1 MB of data into memory at a time.

rexjrs avatar Jul 12 '18 19:07 rexjrs
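The resumable-upload scheme described above boils down to computing byte ranges from a checkpoint. A minimal sketch in plain JS (the names `chunkRanges` and `CHUNK_SIZE` are illustrative, not part of rn-fetch-blob):

```javascript
const CHUNK_SIZE = 1024 * 1024; // 1 MB per request

// Byte ranges [start, end) covering fileSize, beginning at a resume
// checkpoint so an interrupted upload continues where it left off.
function chunkRanges(fileSize, resumeFrom = 0, chunkSize = CHUNK_SIZE) {
  const ranges = [];
  for (let start = resumeFrom; start < fileSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, fileSize)]);
  }
  return ranges;
}

// A 10 MB file uploaded from scratch needs 10 requests:
// chunkRanges(10 * 1024 * 1024).length === 10
// After a crash with 4 MB already uploaded, only 6 remain:
// chunkRanges(10 * 1024 * 1024, 4 * 1024 * 1024).length === 6
```

The checkpoint (`resumeFrom`) is whatever byte offset the app last persisted, so the loop is trivially restartable.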

I'm also looking for similar functionality.

bulats avatar Aug 10 '18 14:08 bulats

@rexjrs Btw, if you still need this you can use RNFetchBlob.fs.slice(path_to_source, path_to_write, firstByte, lastByte);

bulats avatar Aug 17 '18 11:08 bulats
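Building on that tip, one way to chunk an upload is to slice each part into a temporary file, upload it, and delete it, so only one chunk exists at a time. A sketch with the rn-fetch-blob `fs` object injected so the loop can be exercised off-device; `uploadPart` and the `.part-` naming are hypothetical:

```javascript
const CHUNK_SIZE = 1024 * 1024;

// fs is RNFetchBlob.fs in the app; slice(src, dest, start, end) copies
// the bytes [start, end) of src into a new file at dest.
async function uploadInChunks(fs, sourcePath, uploadPart, chunkSize = CHUNK_SIZE) {
  const { size } = await fs.stat(sourcePath);
  const uploaded = [];
  for (let start = 0; start < Number(size); start += chunkSize) {
    const end = Math.min(start + chunkSize, Number(size));
    const partPath = `${sourcePath}.part-${start}`; // hypothetical naming
    await fs.slice(sourcePath, partPath, start, end);
    await uploadPart(partPath, start, end); // e.g. a Content-Range PUT
    await fs.unlink(partPath);              // keep only one chunk on disk
    uploaded.push([start, end]);
  }
  return uploaded;
}

// In the app: await uploadInChunks(RNFetchBlob.fs, path, myUploader);
```

JS memory stays small because the slice lands on disk, not in a variable; only the 1 MB part file is ever materialized at once.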

I have tried loading large data from the server (~70 MB). Loading it directly into the UI gives an out-of-memory error, so I cached it to the file system and read it back, but that runs out of memory too. I am looking for something that would page it internally, as it is a JSON array.

dkaushik95 avatar Oct 18 '18 13:10 dkaushik95
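For reading an already-cached large file without pulling it all into memory, rn-fetch-blob's `readStream(path, encoding, bufferSize)` delivers the file in fixed-size chunks. A sketch; the `consumeInChunks` wrapper is illustrative, and only the stream wiring (`onData`/`onError`/`onEnd`/`open`) is the library's API:

```javascript
// Drives a rn-fetch-blob read stream and hands each chunk to a callback,
// resolving once the whole file has been delivered.
function consumeInChunks(stream, onChunk) {
  return new Promise((resolve, reject) => {
    stream.onData(chunk => onChunk(chunk)); // one call per bufferSize chunk
    stream.onError(reject);
    stream.onEnd(resolve);
    stream.open(); // start streaming only after handlers are attached
  });
}

// In the app, for a ~70 MB JSON file:
//   const stream = await RNFetchBlob.fs.readStream(path, 'utf8', 64 * 1024);
//   await consumeInChunks(stream, chunk => streamingJsonParser.write(chunk));
// (streamingJsonParser stands in for any incremental JSON parser.)
```

This keeps peak JS memory near the buffer size, though parsing a huge JSON array still needs an incremental parser rather than one big `JSON.parse`.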

On further investigation, I realized that keeping such large files in JS memory is not a good idea: it causes out-of-memory errors and crashes the app. A good fix is to use a database (like Realm or WatermelonDB) that processes large data natively and hands us only as much data as we need to work with.

dkaushik95 avatar Oct 31 '18 00:10 dkaushik95

Can somebody show the right implementation of this in React Native code? How do you download files larger than 50 MB without out-of-memory errors?

ihusak avatar Nov 09 '18 09:11 ihusak
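One way to avoid out-of-memory errors on large downloads is to have rn-fetch-blob stream the response straight to a file via `config({ path })`, so the body never enters JS memory. A sketch; `fileNameFrom` and the URL are illustrative, not library API:

```javascript
// Small helper: derive a destination file name from a URL
// (last path segment, query string stripped).
function fileNameFrom(url) {
  return url.split('?')[0].split('/').pop() || 'download.bin';
}

// In the app:
//   const url = 'https://example.com/videos/big.mp4';
//   const dest = `${RNFetchBlob.fs.dirs.DocumentDir}/${fileNameFrom(url)}`;
//   const res = await RNFetchBlob.config({ path: dest }).fetch('GET', url);
//   // res.path() is the file on disk; hand it to a native viewer/player
//   // instead of reading it back into JS.
```

The native side writes the response to `dest` as it arrives, so a 50 MB (or 250 MB) download costs roughly constant JS memory.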

.slice() is promising! Is there any chance we could get an equivalent for writing chunks to a file at a given offset?

This would be useful so that I could download data from a P2P network and store the chunks locally as they're downloaded. https://datproject.org/

RangerMauve avatar Mar 23 '19 22:03 RangerMauve
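rn-fetch-blob has no write-at-offset, but if chunks can be flushed in order, `fs.appendFile` plus a small reorder buffer can emulate it for out-of-order P2P pieces. A sketch; `ChunkAssembler` is illustrative, and in the app the flush callback would call `RNFetchBlob.fs.appendFile`:

```javascript
// Buffers chunks that arrive out of order and flushes them sequentially
// once the chunk at the current write offset is available.
class ChunkAssembler {
  constructor(flush) {
    this.flush = flush;       // async (chunk) => append chunk to the file
    this.nextOffset = 0;      // next offset we can write (in chunk units)
    this.pending = new Map(); // offset -> chunk held until its turn
  }
  async push(offset, chunk) {
    this.pending.set(offset, chunk);
    while (this.pending.has(this.nextOffset)) {
      const c = this.pending.get(this.nextOffset);
      this.pending.delete(this.nextOffset);
      await this.flush(c);
      this.nextOffset += c.length;
    }
  }
}

// In the app, flush could be:
//   c => RNFetchBlob.fs.appendFile(path, c, 'base64')
```

This trades memory for ordering: badly out-of-order chunks sit in the map until their offset comes up, so it suits swarms that deliver mostly sequentially.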

I suppose there is no news on this?

jonathangreco avatar Aug 29 '19 12:08 jonathangreco

One use case would be to decrypt downloaded files on the fly in blocksize chunks. What do you think?

pke avatar Feb 20 '21 08:02 pke

This is an important feature. Are there any plans for it?

ainnotate avatar Mar 03 '24 14:03 ainnotate