
[Enhancement] Provide support for streaming uploads and progress bar

Open · sandeep048 opened this issue Mar 24 '16 · 11 comments

Uploading files using input redirection fails for large files. I presume the whole file is getting read into memory first.

Last time I checked, the requests library could perform streaming uploads: http://docs.python-requests.org/en/master/user/advanced/#streaming-uploads
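
For reference, the pattern from the linked docs is just passing an open file object as the request body; a minimal sketch (the URL and filename here are placeholders):

```python
import requests

# Requests streams the body from the file object in chunks
# instead of reading the whole file into memory first.
with open('test.log', 'rb') as f:
    response = requests.put('https://example.org/test.log', data=f)

print(response.status_code)
```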

It would be nice to integrate this logic into the HTTPie client, perhaps behind a command-line switch. The file upload should also show a progress bar with an ETA.

Please let me know if this functionality already exists. I can have a look at the code if required.

sandeep048 · Mar 24 '16

Please define how the upload fails. What is the exact message?

sigmavirus24 · Mar 24 '16

Very easy to reproduce. This test is on a 512MB machine with 1GB swap.

```
[Fri Mar 25 10:50:35] sandeep@stream:~⟫ http --version
0.9.3
[Fri Mar 25 10:41:39] sandeep@stream:~⟫ sudo fallocate -l 1G test.log
[Fri Mar 25 10:41:51] sandeep@stream:~⟫ ls -alh test.log
-rw-r--r-- 1 root root 1.0G Mar 25 10:41 test.log
[Fri Mar 25 10:42:46] sandeep@stream:~⟫ http PUT https://transfer.sh/test.log < test.log

http: error: MemoryError:
```

sandeep048 · Mar 25 '16

@sandeep048

I'm actually working on this (streamed uploads for redirected input) these days. Stay tuned :sunglasses:

(Relevant kevin1024/pytest-httpbin#33 & kennethreitz/requests#3035)

jkbrzt · Mar 26 '16

Any chance for an update?

macnibblet · Mar 02 '17

@macnibblet yes, I'm looking into it these days.

I'm still considering what the behaviour should be. I believe curl enables chunked uploads when the Transfer-Encoding: chunked header is specified.

HTTPie could do the same: keep the current default behaviour (buffered uploads) and switch to chunked uploads when the user sets Transfer-Encoding: chunked.

Perhaps for piped stdin it could switch to streaming automatically, which would probably be sensible. On the other hand, it'd break backwards compatibility, and it would add another mode that makes the behaviour harder to understand.
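
Roughly, the switch could look like this in Requests terms (a sketch with a hypothetical helper name and chunk size, not actual HTTPie code):

```python
CHUNK_SIZE = 64 * 1024  # hypothetical chunk size

def prepare_body(headers, fileobj):
    """Pick the upload mode based on the user-supplied headers (sketch)."""
    if headers.get('Transfer-Encoding') == 'chunked':
        # A generator body makes Requests send a chunked upload.
        def chunks():
            while True:
                chunk = fileobj.read(CHUNK_SIZE)
                if not chunk:
                    break
                yield chunk
        return chunks()
    # Default: buffered upload (current behaviour).
    return fileobj.read()
```

With a generator body, Requests itself switches to Transfer-Encoding: chunked, so the header the user typed and the actual wire format stay in agreement.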

jkbrzt · Mar 02 '17

Same error with a form POST (without input redirection). Sending a 200 MB binary file causes a MemoryError in a Debian VirtualBox VM with 250 MB of RAM.

Using (as in the docs):

Item type: Form File Fields (field@/dir/file)
Description: Only available with --form, -f. For example screenshot@~/Pictures/img.png. The presence of a file field results in a multipart/form-data request.

Command line: $ http -f POST http://10.0.2.2:8000/uploads/ [email protected]

Error: http: error: MemoryError:

caco13 · Mar 06 '17

I wrote streaming and chunked upload support for Requester, an HTTP client I built for Sublime Text. It's also built on top of Requests.

In Requests, streaming uploads are really simple. You just pass a file handle to the data arg of requests.

Chunked uploads happen automatically if you pass a generator to the data arg. Requests will set the Transfer-Encoding: chunked header on the request if you do so. Passing a generator provides the same memory benefits as a streaming upload (you don't have to read the entire file into memory), and it also lets you run code each time the generator yields another chunk of data, which makes it possible to display a progress bar.

This is how I solved this problem in Requester. I wrote a function called read_in_chunks that accepts a handle_read callback; each time the function reads another chunk, it passes the chunk count and chunk size to handle_read, which then displays a status bar.
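
A minimal sketch of that approach (the URL, file path, and chunk size here are placeholders; read_in_chunks and handle_read follow the description above, not Requester's exact code):

```python
import requests

def read_in_chunks(path, handle_read, chunk_size=8192):
    # Generator body: Requests sends it with Transfer-Encoding: chunked.
    with open(path, 'rb') as f:
        count = 0
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            count += 1
            handle_read(count, chunk_size)  # hook for progress display
            yield chunk

def handle_read(count, chunk_size):
    # Stand-in progress display; Requester updates Sublime's status bar here.
    print('uploaded ~{} bytes'.format(count * chunk_size))

response = requests.post(
    'https://example.org/upload',
    data=read_in_chunks('big_file.bin', handle_read),
)
```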

Not all servers accept chunked uploads. From what I understand, requests-toolbelt lets you invoke a function on each iteration of streaming uploads as well, but I didn't want to pull in another dependency to make this work.

kylebebak · Aug 14 '17

What's the latest here? I'm also seeing "http: error: Request timed out (30s)." for large files.

noahcoad · Oct 20 '19

Any news?

dausruddin · Aug 10 '20

@darshanime streamed uploads have already been implemented in master, so you should have no issues with large files (it will be released with v2.3.0). Upload progress bar coming soon.

jkbrzt · Sep 28 '20

@dausruddin 🔝

darshanime · Sep 28 '20