Feature - Allow limiting the amount of data retrieved
I'm not sure if this is already possible, but I was looking for a way to limit the amount of data retrieved with a Request::get(). Such an option would be useful for performing "test calls" against sites, or for retrieving only the headers (and a chunk of the body) from sites that do not accept a plain "HEAD" request.
The library is built upon cURL. AFAIK the PHP bindings do not support this (curl_exec is all or nothing). This would probably only be possible if we opened up something like a socket stream.
It's actually possible to retrieve only a specific amount of data with cURL. It can be done by passing a write callback to cURL that returns zero once the desired amount of data has been received; cURL then aborts the transfer. This surfaces as an error, though, which has to be intercepted and handled (currently, Httpful simply stops with an error when it occurs).
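A minimal sketch of the technique with the raw PHP cURL extension (not the Httpful API; the variable names and the 1 KiB limit are illustrative):

```php
<?php
// Cap the number of body bytes cURL will deliver by returning 0 from the
// write callback once the limit is reached. Returning any value other than
// strlen($chunk) makes cURL abort the transfer with CURLE_WRITE_ERROR.

$maxBytes = 1024;   // illustrative limit
$body     = '';

$ch = curl_init('https://example.com/');
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) use (&$body, $maxBytes) {
    $body .= $chunk;
    if (strlen($body) >= $maxBytes) {
        return 0;              // deliberately abort the transfer
    }
    return strlen($chunk);     // chunk fully consumed, keep going
});

if (curl_exec($ch) === false) {
    // CURLE_WRITE_ERROR (23) is the expected outcome of our deliberate abort;
    // anything else is a genuine transfer failure.
    if (curl_errno($ch) !== CURLE_WRITE_ERROR) {
        throw new RuntimeException(curl_error($ch));
    }
}
curl_close($ch);

// $body now holds roughly the first $maxBytes bytes of the response
// (cURL hands data over in chunks, so it may slightly overshoot).
```

This is the "trap exceptions" issue mentioned above: from Httpful's point of view the aborted transfer is indistinguishable from a failed one, so the caller needs a way to treat it as success.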
Interesting. Thanks for sharing. I'm open to a pull request. You may need to add a "trap exceptions" option to Request to get around the error handling issue you bring up.
I will see if I can make it work. In case I don't manage it in a reasonable time, here is a discussion of the various methods for achieving this, for reference (someone else might also be interested in it): StackOverflow - How to set a maximum size limit to php curl downloads.
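One alternative covered in that StackOverflow discussion is aborting from the progress callback instead of the write callback. A hedged sketch (again raw cURL, not Httpful; names and the limit are illustrative):

```php
<?php
// Abort via the progress callback: a non-zero return value makes cURL stop
// the transfer with CURLE_ABORTED_BY_CALLBACK (42).

$maxBytes = 1024;   // illustrative limit

$ch = curl_init('https://example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOPROGRESS, false);   // the callback is disabled by default
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION,
    function ($ch, $downloadTotal, $downloaded, $uploadTotal, $uploaded) use ($maxBytes) {
        return ($downloaded > $maxBytes) ? 1 : 0;   // non-zero aborts
    });

$body = curl_exec($ch);
if ($body === false && curl_errno($ch) !== CURLE_ABORTED_BY_CALLBACK) {
    throw new RuntimeException(curl_error($ch));
}
curl_close($ch);
```

Note the trade-off: with CURLOPT_RETURNTRANSFER, an aborted curl_exec returns false and the partial body is discarded, so the write-callback approach is preferable when the truncated body itself is wanted.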
Feature implemented, pull request on the way.