fossa
HTTP Client API: High memory consumption when downloading big files
I tried the sample HTTP client app to download some big files (about 1 GB) on desktop. While tracking memory usage, I noticed it grows constantly until the file is fully downloaded. This happens because the library buffers the entire response in memory before the NS_HTTP_REPLY event is fired.
It would be nice to extend the library's HTTP client API to support downloading big HTTP responses as data chunks, to avoid exhausting all available memory on a device...
In that case, NS_RECV could be handled and a streaming read implemented: upon each NS_RECV event, the received data chunk could be saved into a file and removed from the iobuf.
That way, a raw multipart POST could be saved into a regular file, and a separate function could be provided to extract the file name and file data from that raw file.
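A minimal sketch of such a helper, assuming the destination file is already open (save_into_file is an illustrative name here, not a library function):

#include <stdio.h>
#include "fossa.h"

/* Append everything currently buffered in the iobuf to an open file,
 * then drop it from memory so the receive buffer stays small. */
static void save_into_file(struct iobuf *io, FILE *fp) {
  if (io->len > 0) {
    fwrite(io->buf, 1, io->len, fp);
    iobuf_remove(io, io->len);  /* discard the saved bytes from RAM */
  }
}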
@cpq Thank you for your reply! I'm looking for a way to handle big responses "on the fly", so storing everything into a file first and then re-reading it is not the preferred approach.
If I understand correctly, I need to handle the NS_RECV event, parse the received buffer to catch the HTTP headers and recognise where the response body begins, and clear the received data buffer after each iteration... Perhaps some additional events could be raised by the library, e.g. NS_HTTP_HEADER and NS_HTTP_BODY, or am I still wrong?
That's correct. The approach you describe is pretty complex to implement, because the handler needs to be a state machine. I was describing a simpler way, where the handler just saves the raw HTTP response into a file and the parsing is then done over the file, something like this:
case NS_RECV:
  /* Flush the newly received bytes to disk; my_file is the user's open file handle */
  save_into_file(&nc->recv_iobuf, my_file);
  break;
case NS_HTTP_REPLY:
  /* The response is complete: parse the saved file, then delete it */
  parse_and_delete_file(my_file);
  break;
OK. Your sample looks clear, and it seems I need to implement that state machine on top of the library, with the state managed by the connection instance. That is OK.
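For the record, here is roughly the shape I have in mind. This is an untested sketch against the plain fossa TCP API: struct dl_state, on_body_chunk(), find_body_offset() and the host/path are my own illustrative names, and it assumes an identity-encoded HTTP/1.0 response (no chunked transfer encoding).

#include <stdio.h>
#include <stdlib.h>
#include "fossa.h"

/* Per-connection download state, carried in nc->user_data. */
struct dl_state {
  int headers_done;  /* set once the end of the HTTP headers has been seen */
  FILE *fp;          /* destination for the body chunks */
};

/* Application callback: consume one body chunk "on the fly". */
static void on_body_chunk(struct dl_state *st, const char *data, size_t len) {
  fwrite(data, 1, len, st->fp);
}

/* Return the offset just past "\r\n\r\n", or -1 if the headers are incomplete. */
static int find_body_offset(const char *buf, size_t len) {
  size_t i;
  for (i = 0; i + 3 < len; i++) {
    if (buf[i] == '\r' && buf[i + 1] == '\n' &&
        buf[i + 2] == '\r' && buf[i + 3] == '\n') {
      return (int) (i + 4);
    }
  }
  return -1;
}

static void ev_handler(struct ns_connection *nc, int ev, void *ev_data) {
  struct dl_state *st = (struct dl_state *) nc->user_data;
  struct iobuf *io = &nc->recv_iobuf;

  switch (ev) {
    case NS_CONNECT:
      if (*(int *) ev_data != 0) break;  /* connect failed */
      /* Plain TCP connection, so send the request by hand;
       * HTTP/1.0 keeps the server from using chunked encoding. */
      ns_printf(nc, "GET /big.bin HTTP/1.0\r\nHost: example.org\r\n\r\n");
      break;
    case NS_RECV:
      if (!st->headers_done) {
        int off = find_body_offset(io->buf, io->len);
        if (off < 0) break;  /* headers incomplete, keep buffering */
        st->headers_done = 1;
        if ((size_t) off < io->len) {
          on_body_chunk(st, io->buf + off, io->len - off);
        }
      } else {
        on_body_chunk(st, io->buf, io->len);
      }
      iobuf_remove(io, io->len);  /* keep memory usage flat */
      break;
    case NS_CLOSE:
      /* An HTTP/1.0 server closes the connection when the body ends. */
      if (st != NULL) {
        if (st->fp != NULL) fclose(st->fp);
        free(st);
      }
      break;
  }
}

The per-connection state would then be attached right after the connection is created, along the lines of:

struct ns_mgr mgr;
ns_mgr_init(&mgr, NULL);
struct ns_connection *nc = ns_connect(&mgr, "example.org:80", ev_handler);
struct dl_state *st = calloc(1, sizeof(*st));
st->fp = fopen("big.bin", "wb");
nc->user_data = st;
for (;;) ns_mgr_poll(&mgr, 1000);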