proxy.py
Add support for Expect: 100-continue
Is your feature request related to a problem? Please describe.
Uploading a file with curl through proxy.py, as follows, currently results in an intermittent "Done waiting for 100-continue":
$ curl -v \
-x localhost:8899 \
-F "[email protected]" \
http://httpbin.org/post
* Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 8899 (#0)
> POST http://httpbin.org/post HTTP/1.1
> Host: httpbin.org
> User-Agent: curl/7.54.0
> Accept: */*
> Proxy-Connection: Keep-Alive
> Content-Length: 58398
> Expect: 100-continue
> Content-Type: multipart/form-data; boundary=------------------------cd14da735706a952
>
* Done waiting for 100-continue
< HTTP/1.1 200 OK
< Access-Control-Allow-Credentials: true
< Access-Control-Allow-Origin: *
< Content-Type: application/json
< Date: Fri, 11 Oct 2019 00:05:54 GMT
< Referrer-Policy: no-referrer-when-downgrade
< Server: nginx
< X-Content-Type-Options: nosniff
< X-Frame-Options: DENY
< X-XSS-Protection: 1; mode=block
< Content-Length: 78041
< Connection: keep-alive
<
{
"args": {},
"data": "",
"files": {
"image": "data:application/octet-stream;base64,
... [ redacted ] ...
Describe the solution you'd like
- Dispatch the request headers upstream, wait for its 100 Continue response, then relay it back to the client.
- Respond 100 Continue directly to the client without consulting upstream.
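The first option could be sketched roughly as follows. This is a hypothetical illustration, not proxy.py's actual API; the upstream and client objects are assumed to be socket-like (anything exposing sendall/recv):

```python
# Hedged sketch of option 1: forward the request headers upstream, wait
# for the interim response, and relay the verdict to the client.
# NOT proxy.py's API; upstream/client are any socket-like objects.

def relay_expectation(upstream, client, request_headers: bytes) -> bool:
    """Send headers upstream and relay its interim response to the client.

    Returns True if the upstream agreed (100 Continue), meaning the
    client may now transmit the request body.
    """
    upstream.sendall(request_headers)
    interim = upstream.recv(4096)  # e.g. b'HTTP/1.1 100 Continue\r\n\r\n'
    client.sendall(interim)
    return interim.startswith((b'HTTP/1.1 100', b'HTTP/1.0 100'))
```

An upstream that rejects the expectation (e.g. with 417 Expectation Failed) is relayed the same way, and the client then knows not to send the body.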
Per https://www.w3.org/Protocols/rfc2616/rfc2616-sec8.html
Proxies SHOULD maintain a cache recording the HTTP version
numbers received from recently-referenced next-hop servers.
Since proxy.py doesn't plan to maintain such a cache for now, and since most HTTP servers now support HTTP/1.1, we'll respond 100 Continue to the client without consulting upstream.
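The chosen behavior could look roughly like this. A minimal sketch, assuming request headers are already parsed into a dict with lowercase keys and `send` is any callable accepting bytes (e.g. a client socket's sendall); none of these names are proxy.py's actual API:

```python
# Hedged sketch: respond 100 Continue directly to the client without
# consulting upstream, as decided above. Names here are assumptions.

CONTINUE_RESPONSE = b'HTTP/1.1 100 Continue\r\n\r\n'

def expects_continue(headers: dict) -> bool:
    """True if the request carries `Expect: 100-continue`.
    Assumes headers normalized to lowercase str keys/values."""
    return headers.get('expect', '').lower() == '100-continue'

def maybe_send_continue(headers: dict, send) -> bool:
    """Acknowledge the expectation so the client starts sending the body.
    Returns True if an interim response was written."""
    if expects_continue(headers):
        send(CONTINUE_RESPONSE)
        return True
    return False
```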
I ran into this issue: when posting data via proxy.py using curl, which adds the Expect header by default, I get an exception.
2020-01-26 23:05:09,832 - pid:28089 [E] run:415 - Exception while handling connection <socket.socket fd=12, family=AddressFamily.AF_INET, type=2049, proto=0, laddr=('192.168.1.251', 2080), raddr=('192.168.1.251', 40826)>
Traceback (most recent call last):
File "/data/proxy.py/proxy/http/handler.py", line 405, in run
teardown = self.run_once()
File "/data/proxy.py/proxy/http/handler.py", line 390, in run_once
teardown = self.handle_events(readables, writables)
File "/data/proxy.py/proxy/http/handler.py", line 199, in handle_events
teardown = plugin.read_from_descriptors(readables)
File "/data/proxy.py/proxy/http/proxy/server.py", line 152, in read_from_descriptors
self.response.parse(raw.tobytes())
File "/data/proxy.py/proxy/http/parser.py", line 167, in parse
raise NotImplementedError('Parser shouldn\'t have reached here')
NotImplementedError: Parser shouldn't have reached here
Once I removed the Expect header by setting "Expect:" in my headers, it worked fine. It was just a bit annoying, because I couldn't see why replaying a payload for testing failed while the client worked; it turned out to be the Expect header.
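One plausible cause of the parser exception above (an assumption; the traceback alone doesn't prove it) is that the upstream emits an interim HTTP/1.1 100 Continue before its final response, and a parser expecting a single status line ends up in an unhandled state. A response reader can skip interim 100 responses before parsing, roughly:

```python
# Hedged sketch, not proxy.py's parser: drop leading interim
# 'HTTP/1.x 100 Continue' responses from a raw response buffer.

def strip_interim_continue(raw: bytes) -> bytes:
    """Return the final response from a raw buffer, skipping any leading
    100 Continue interim responses. Interim responses carry no body, so
    each one ends at the first blank line."""
    while raw.startswith((b'HTTP/1.1 100', b'HTTP/1.0 100')):
        end = raw.find(b'\r\n\r\n')
        if end == -1:
            break  # interim response not yet fully buffered
        raw = raw[end + 4:]
    return raw
```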
@plambrechtsen thanks for the update. Apologies for the delay, but it looks like I never received a notification about your update (or I probably missed it somehow). Revisiting this issue because of another possibly related issue: unable to upload files (greater than 100kb) when TLS interception is on.
@roshanprince402 I tried a file upload and ran into a similar issue when uploading under TLS interception, though I received a different error than what you experienced here: https://github.com/abhinavsingh/proxy.py/issues/351#issue-624083799. Among other factors, a likely reason for the upload failure is the missing Expect: 100-continue implementation.
Looking into it this month, hopefully we'll have a resolution soon.