Gaufrette
Handling large files
We have files of a few gigabytes and would like to use Gaufrette as an abstraction, so we can be flexible in choosing our storage.
However, I can't seem to find a way to handle those without spending far too much memory storing the content itself. Gaufrette seems to lack file-append functionality and a way to move objects between filesystems (e.g. local to FTP).
What is the recommended approach for handling large files?
:+1:
Useful here too.
:+1:
The only correct approach is to pipe one stream into another, i.e. to buffer chunks of data without holding it all in memory; but that much is obvious and only theoretical.
I have no idea how it is implemented in Gaufrette, but I think there is a way to get a file descriptor out of a Gaufrette file. EDIT: you can use the StreamWrapper feature to create file descriptors, as in the example here: https://github.com/KnpLabs/Gaufrette#streaming-files
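Roughly per that README section, registering the wrapper looks like this (a hedged sketch: the map key 'backup', the key 'big-file.bin', and the $filesystem instance are placeholders, and the API may differ in your installed version):

```php
<?php
use Gaufrette\StreamWrapper;

// Expose a Gaufrette\Filesystem instance under a name of your choosing.
$map = StreamWrapper::getFilesystemMap();
$map->set('backup', $filesystem);

// Registers the gaufrette:// stream scheme with PHP.
StreamWrapper::register();

// Ordinary stream functions now work on Gaufrette keys:
$fd = fopen('gaufrette://backup/big-file.bin', 'rb');
```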
Then, use the typical while ($chunk = fread($sourceFd, 1024)) { fwrite($targetFd, $chunk); }
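Spelled out, the chunked copy looks like the sketch below. It uses in-memory php://temp streams so it runs standalone; with the StreamWrapper registered, $source and $target could just as well be gaufrette:// URLs opened with fopen(). Note the feof() check: the bare `while ($chunk = fread(...))` form would stop early on a chunk consisting of the string "0", which PHP treats as falsy.

```php
<?php
// Copy between two stream resources in fixed-size chunks, so only
// $chunkSize bytes are ever held in memory at once.
function copyStream($source, $target, int $chunkSize = 8192): int
{
    $copied = 0;
    while (!feof($source)) {
        $chunk = fread($source, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        $copied += fwrite($target, $chunk);
    }
    return $copied;
}

// Demo with in-memory streams standing in for large files.
$source = fopen('php://temp', 'r+');
fwrite($source, str_repeat('x', 100000));
rewind($source);

$target = fopen('php://temp', 'r+');
$bytes = copyStream($source, $target);
```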
Edit: you can also get a Stream object from the filesystem:
$stream = $fileSystem->createStream($key);
With this $stream, you can then call ->read() and ->write() on it.
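Putting that together, copying a key between two filesystems (say, local to FTP) might look like the following. This is a hedged sketch assuming Gaufrette's Stream interface (open(StreamMode), read($count), write($data), eof(), close()) as described above; $localFilesystem and $ftpFilesystem are placeholder Filesystem instances, so check the API of your installed version:

```php
<?php
use Gaufrette\StreamMode;

// Placeholder filesystems; in practice, built with Local and Ftp adapters.
$source = $localFilesystem->createStream('big-file.bin');
$target = $ftpFilesystem->createStream('big-file.bin');

$source->open(new StreamMode('rb'));
$target->open(new StreamMode('wb'));

// Only 8 KiB is held in memory at a time.
while (!$source->eof()) {
    $chunk = $source->read(8192);
    if ($chunk === false || $chunk === '') {
        break;
    }
    $target->write($chunk);
}

$source->close();
$target->close();
```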
There's a bundle that does this. It would be nice, though, if chunked file saving were handled by the GaufretteBundle itself, so we don't have to add another layer on top.
Do you have a complete example of using streams to chunk save files, @docteurklein?
Unfortunately no, I don't.
Ended up not using the bundle, as I can chunk-save files myself (as you describe in your comment above). A chunked save method for the local adapter (and others, of course) would still be nice.
🏓