playwright-python
[BUG] Cookie header ignored when using proxy
Context:
- Playwright Version: 1.24.1
- Operating System: Linux
- Python version: 3.9.6
- Browser: Chromium, Firefox, WebKit
Code Snippet
import asyncio

from playwright.async_api import async_playwright


async def main():
    async with async_playwright() as p:
        for browser_type in [p.chromium, p.firefox, p.webkit]:
            browser = await browser_type.launch(
                proxy={
                    # on a separate terminal:
                    # ./mitmproxy --proxyauth "user:pass"
                    "server": "http://127.0.0.1:8080",
                    "username": "user",
                    "password": "pass",
                }
            )
            page = await browser.new_page()
            await page.route("**", set_cookie)
            await page.goto("http://httpbin.org/headers")
            print("*" * 100)
            print("browser:", browser_type.name)
            print(await page.content())
            await browser.close()


async def set_cookie(route, request) -> None:
    headers = await request.all_headers()
    headers["accept"] = "application/json"
    headers["cookie"] = "foo=bar"
    await route.continue_(headers=headers)


if __name__ == "__main__":
    asyncio.run(main())
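To rule out the external service as a factor, the httpbin.org/headers endpoint can be replaced by a minimal local stand-in that echoes whatever headers actually arrive. This is a stdlib-only sketch of my own (the EchoHeaders handler is not part of Playwright); pointing page.goto at its URL instead of httpbin.org shows whether the Cookie header survives the proxy hop:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen


class EchoHeaders(BaseHTTPRequestHandler):
    """Echo the incoming request headers back as JSON."""

    def do_GET(self):
        body = json.dumps({"headers": dict(self.headers)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the console quiet


# Port 0 asks the OS for any free port.
server = HTTPServer(("127.0.0.1", 0), EchoHeaders)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Sanity check without Playwright: send a Cookie header directly
# and confirm the server sees it.
req = Request(
    f"http://127.0.0.1:{server.server_port}/headers",
    headers={"Cookie": "foo=bar"},
)
echoed = json.loads(urlopen(req).read())["headers"]
print(echoed.get("Cookie"))  # -> foo=bar
server.shutdown()
```

With this in place, the repro snippet only needs its goto URL changed to the local server's address.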
Describe the bug
Cookies are not sent to the target site when using a proxy. The above snippet produces the following output (note that other headers seem to be unaffected; accept=application/json, for example, is set correctly):
****************************************************************************************************
browser: chromium
<html><head><meta name="color-scheme" content="light dark"></head><body><pre style="word-wrap: break-word; white-space: pre-wrap;">{
"headers": {
"Accept": "application/json",
"Accept-Encoding": "gzip, deflate",
"Cache-Control": "no-cache",
"Host": "httpbin.org",
"Pragma": "no-cache",
"Proxy-Connection": "keep-alive",
"Upgrade-Insecure-Requests": "1",
"User-Agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) HeadlessChrome/104.0.5112.48 Safari/537.36",
"X-Amzn-Trace-Id": "Root=1-62f46cfb-2c1803c62aba39a03e860127"
}
}
</pre></body></html>
****************************************************************************************************
browser: firefox
<html><head><link rel="stylesheet" href="resource://content-accessible/plaintext.css"></head><body><pre>{
"headers": {
"Accept": "application/json",
"Accept-Encoding": "gzip, deflate",
"Accept-Language": "en-US,en;q=0.5",
"Host": "httpbin.org",
"Upgrade-Insecure-Requests": "1",
"User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0",
"X-Amzn-Trace-Id": "Root=1-62f46cff-1ce3af1c6664372b3645b7da"
}
}
</pre></body></html>
****************************************************************************************************
browser: webkit
<html><head></head><body><pre style="word-wrap: break-word; white-space: pre-wrap;">{
"headers": {
"Accept": "application/json",
"Accept-Encoding": "gzip, deflate",
"Accept-Language": "en-US",
"Host": "httpbin.org",
"Upgrade-Insecure-Requests": "1",
"User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.0 Safari/605.1.15",
"X-Amzn-Trace-Id": "Root=1-62f46d02-20e4151c3588901e0ec98ff6"
}
}
</pre></body></html>
Commenting out the proxy keyword when launching the browser causes the Cookie header to be sent by Chromium and Firefox, but not by WebKit.
On the other hand, setting a cookie in the context right after creating the page causes the cookie to appear:
page = await browser.new_page()
await page.context.add_cookies(
    [
        {
            "name": "key",
            "value": "value",
            "domain": "httpbin.org",
            "path": "/",
        }
    ]
)
I notice I have to set the domain explicitly in this case, which makes me wonder: is this expected? I suppose it could be a security measure to avoid leaking cookies to the wrong domains.
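On the domain question: cookie stores generally attach a cookie only when the request host matches the cookie's domain, so requiring an explicit domain here would be consistent with that rule. As a stdlib-only illustration (this is Python's http.cookiejar, not Playwright's implementation, but the matching rule is the same in spirit):

```python
import urllib.request
from http.cookiejar import Cookie, CookieJar


def make_cookie(name, value, domain, path="/"):
    # Build a stdlib Cookie scoped to a domain, mirroring the
    # name/value/domain/path fields passed to add_cookies above.
    return Cookie(
        version=0, name=name, value=value,
        port=None, port_specified=False,
        domain=domain, domain_specified=True, domain_initial_dot=False,
        path=path, path_specified=True,
        secure=False, expires=None, discard=True,
        comment=None, comment_url=None, rest={},
    )


jar = CookieJar()
jar.set_cookie(make_cookie("foo", "bar", "httpbin.org"))

# The jar attaches the cookie only when the request host matches.
req_match = urllib.request.Request("http://httpbin.org/headers")
jar.add_cookie_header(req_match)
print(req_match.get_header("Cookie"))  # -> foo=bar

req_other = urllib.request.Request("http://example.com/")
jar.add_cookie_header(req_other)
print(req_other.get_header("Cookie"))  # -> None
```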
Thanks in advance for your time.