
Passing resource handle to Storage (Gaufrette) instead of reading file in memory

Open MonsieurLanza opened this issue 7 years ago • 2 comments

Hello,

We have some big files (several GB) to upload, and we cannot afford to read each whole file into memory with file_get_contents(). Some Gaufrette adapters (if not all, I'm not sure about this) accept resource handles as the argument to their write() method, and I submitted a PR to Gaufrette to allow AWS S3 multipart uploads, which read the file from a handle in smaller chunks.
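To illustrate the difference, here is a rough sketch of the two approaches against a Gaufrette filesystem. This is not code from the bundle; it assumes an adapter whose write() accepts a stream resource, which (as noted above) not every adapter does.

```php
<?php
// Hypothetical sketch, assuming $filesystem is a Gaufrette\Filesystem
// backed by an adapter that can consume a stream resource.

// In-memory approach: the entire file is loaded at once,
// which fails for multi-GB files.
$filesystem->write($remotePath, file_get_contents($localPath), true);

// Stream approach: pass a resource handle instead, so the adapter
// can read it in smaller chunks (e.g. for an S3 multipart upload).
$handle = fopen($localPath, 'rb');
$filesystem->write($remotePath, $handle, true);
fclose($handle);
```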

For now we use a modified version of the VichUploaderBundle with a raw modification of GaufretteStorage.php (see https://github.com/MonsieurLanza/VichUploaderBundle/blob/master/Storage/GaufretteStorage.php#L48), but we would like to be able to use the standard distribution.
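For context, the kind of change involved looks roughly like the following. This is a hand-written sketch, not the exact code from the linked fork; the method name, signature, and surrounding logic are placeholders.

```php
<?php
// Hypothetical sketch of the modification to GaufretteStorage:
// open the uploaded file as a stream and hand the resource to
// Gaufrette, instead of slurping the whole file into a string.

// Before (stock behaviour, whole file in memory):
// $this->filesystem->write($path, file_get_contents($file->getPathname()), true);

// After (stream handed to the adapter):
$stream = fopen($file->getPathname(), 'rb');
$this->filesystem->write($path, $stream, true);
fclose($stream);
```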

I was thinking about adding a configuration option, but before I go on, I'd like to have your opinion on the best way to achieve this.

Thanks.

MonsieurLanza avatar Jul 04 '17 12:07 MonsieurLanza

If you think this could be useful for someone else, a PR would be very welcome. Otherwise, you don't need to maintain your own copy of this bundle: just define your own storage and use it. See the docs for more info
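The custom-storage route suggested above could look something like this. The class and service names are placeholders, and the exact way to point the bundle at a custom storage service should be checked against the docs linked above.

```php
<?php
// Hypothetical custom storage implementing the bundle's
// Vich\UploaderBundle\Storage\StorageInterface, uploading via a
// stream handle instead of file_get_contents(). Method bodies are
// elided; see the interface for the full contract.

use Vich\UploaderBundle\Storage\StorageInterface;

class StreamingGaufretteStorage implements StorageInterface
{
    // Implement upload() to fopen() the source file and pass the
    // resource to the Gaufrette filesystem's write() method, and
    // delegate the remaining StorageInterface methods (remove,
    // resolvePath, ...) to the stock Gaufrette storage.
}
```

This keeps the stock bundle untouched while still getting chunked uploads for the adapters that support stream resources.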

garak avatar Jul 04 '17 12:07 garak

It would allow bypassing the 5 GB single-upload limit on AWS S3 (and would avoid memory exhaustion). I don't know whether this is a common issue, but as is, the change may fail with other Gaufrette adapters, which is why I didn't send a PR; it would need more work.

Thanks for the link.

MonsieurLanza avatar Jul 04 '17 12:07 MonsieurLanza