Upgrade the packaging setup
I see that currently `setup.py` is being used and called directly. This is highly discouraged and deprecated. See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for more details.
It's quite easy to upgrade it to use PEP 517 with invocations via pypa/pip and pypa/build.
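For anyone following along, the modern equivalent amounts to roughly this (a sketch, not necessarily the exact setup the PRs below introduce): a `pyproject.toml` declaring the build backend, after which builds go through `pypa/build` instead of direct `setup.py` invocations.

```toml
# pyproject.toml sketch: declare the build backend per PEP 517/518
[build-system]
requires = ["setuptools >= 42", "wheel"]
build-backend = "setuptools.build_meta"
```

With that in place, `python -m pip install build` followed by `python -m build` produces the sdist and wheel.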
If you're open to improvements, I could try to find some time to contribute some packaging and CI/CD-related PRs.
@webknjaz I'd highly appreciate PRs for these; I could learn new things :). I am stuck in classic vanilla `setup.py` workflows, and seemingly things have evolved a lot in this area. Thank you 👍
Woah! Among other problems, there are no wheels uploaded for the last versions on PyPI (since 2019!)
First improvement done: https://github.com/abhinavsingh/proxy.py/pull/647.
@abhinavsingh I've submitted a number of linting-setup maintainability improvements. The last one includes everything, but I thought it would probably be easier/more digestible if the building blocks were their own atomic PRs:
- https://github.com/abhinavsingh/proxy.py/pull/652
- https://github.com/abhinavsingh/proxy.py/pull/653
- https://github.com/abhinavsingh/proxy.py/pull/654
- https://github.com/abhinavsingh/proxy.py/pull/655
- https://github.com/abhinavsingh/proxy.py/pull/656
- https://github.com/abhinavsingh/proxy.py/pull/657
This setup is rather strict but has a lot of excludes in the config, because a lot of linting violations get revealed otherwise. I think it'd make a nice series of "good first issues" for beginner contributors to work through.
+1 on top https://github.com/abhinavsingh/proxy.py/pull/658
@abhinavsingh I wanted to ask you if you have a strong preference to have multiple testing workflows. It feels easier to have the testing jobs in the same workflow because then it's possible to create useful dependencies between them.
Another suggestion would be to make the workflow and job names shorter. Ever since GHA was introduced, many projects noticed that long job names don't fit in the UI in multiple places on the website, making them rather useless. From this, the practice of keeping them short emerged. I hope you don't mind if I try to come up with better names.
Uncommented another fixer on top: https://github.com/abhinavsingh/proxy.py/pull/661
And here are two more improvements:
- https://github.com/abhinavsingh/proxy.py/pull/662
- https://github.com/abhinavsingh/proxy.py/pull/663
I'm starting to hit merge conflicts so I'll pause for some time to give you a chance to get these merged before I add more.
@webknjaz Yes naming shouldn't be an issue. My original intention here was to surface enough context via job names. Checking on the PRs.
Yeah I see that. But currently you repeat the same context in many places. Job names show up right after the workflow names in all of the UIs. Plus adding the project name is not particularly useful since when you look at the CI pages, it's visible what repo they are in.
@abhinavsingh here are fixes for the `pre-commit.ci` env:
- https://github.com/abhinavsingh/proxy.py/pull/666
- https://github.com/abhinavsingh/proxy.py/pull/667
@webknjaz Locally I ran into PyLint `socket`-module-related errors. Amazingly, the same check passes on GHA (and I guess even for you locally). So, am I missing something locally? There is a variety of suggestions on StackOverflow. Any hints?

@abhinavsingh this is interesting... are you on macOS? Locally, I run Gentoo Linux, and GHA and pre-commit.ci both use Ubuntu, I think. But yeah, pylint is known for behaving inconsistently sometimes, depending on the env it runs under (including the Python version, because some of its checks tend to be semi-dynamic, as opposed to flake8 which is fully static). It can be a little annoying. I'll see what I can do. Worst case, we can disable the specific rule.
https://github.com/PyCQA/pylint/issues/4759#issuecomment-890564379 seems to suggest that the Python shipped by Homebrew tends to be problematic. Usually, I use pyenv (for many reasons): it allows me not to mess with the system installation of CPython, which may be patched or may influence various components of my system. I specifically use the userspace installation so it all lives in my `~/.pyenv`, and even if I destroy it completely, it won't affect important parts of the system. I can delete and recreate it at any time, plus it allows me to have as many interpreter versions as I like.
Yep, I am on macOS. Lemme give it a try locally with pyenv. We might need to highlight this fact somewhere for macOS contributors, or maybe simply relax pylint for socket classes (which is kind of covered via mypy).
Well, we could apply inline disables for those cases. Or maybe add a transform plugin https://pylint.pycqa.org/en/latest/how_tos/transform_plugins.html. But I'd argue that this is a band-aid and would just shadow the real issue of having a bad CPython env, and people should be encouraged to fix their dev envs (which is probably what this false-positive really reveals).
So here's some job/workflow name updates making them mostly fit in the UI:
- https://github.com/abhinavsingh/proxy.py/pull/669
@webknjaz I checked random repositories on GitHub, and it looks like, along with `setup.cfg`, repos are also adding a minimal `setup.py`:

```python
from setuptools import setup
setup()
```

See https://github.com/search?q=filename%3A.pyup.yml+setup.cfg&type=code for reference. I searched this to understand why pyup broke after the `setup.py` removal. This seems the likely reason. Wdyt?
setuptools has supported not having `setup.py` for years now. I've been deleting it from all my projects since then. The only real problem was that `pip install -e` didn't work without it, which is why many projects adopted a minimal stub file. But that's been fixed in recent releases, so it's not a concern anymore. The concern is that having `setup.py` in the project provides a way to bypass PEP 517.

As for pyup, I don't know. You could probably configure it to exclude `setup.py` from its checks. It's a bad idea to pin the direct deps in the package metadata anyway.
Thank you, makes sense. I haven't kept up with changes of late.
I also observed that the codecov integration is broken. Looking at the GHA logs, the upload-coverage step fails with "no coverage file found", though I see coverage files being generated just fine. Any hints on what might have gone wrong here? The root directory context doesn't change, so `.coverage` should still be auto-discoverable.
Here's a doc on the config file: https://pyup.io/docs/bot/config/#specifying. Or you could just migrate to Dependabot like others did.
I think that the official codecov GHA looks for an XML report that pytest does not currently produce. I was going to fix it by adding `--cov-report=xml` but haven't gotten to it.
> Here's a doc on the config file: https://pyup.io/docs/bot/config/#specifying. Or you could just migrate to Dependabot like others did.
Will deprecate pyup in favor of Dependabot. I was just curious about what happened that broke it.
> I think that the official codecov GHA looks for an XML report that pytest does not currently produce. I was going to fix it by adding `--cov-report=xml` but haven't gotten to it.
Yep this probably should fix it :). Thank you
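For the record, the change being discussed would be a one-liner in the pytest configuration. A hedged sketch (the exact section header depends on whether the config lives in `setup.cfg`, `pytest.ini`, or elsewhere, and assumes pytest-cov is already in use):

```ini
# sketch: make pytest-cov also emit the coverage.xml that the codecov action expects
[tool:pytest]
addopts = --cov=proxy --cov-report=term-missing --cov-report=xml
```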
- https://github.com/abhinavsingh/proxy.py/pull/672
- https://github.com/abhinavsingh/proxy.py/pull/673
@webknjaz I realized that `flake8` and `pylint` now use 100% of all available cores. This typically happens for 5-10 seconds and is noticeable due to the increase in fan speed. I am wondering if there is something we can (must) do about it. Thank you.
Oh, this reminds me that I forgot to increase the number of processes for pylint. flake8 already has `auto` set by default, but pylint's config just sets a single job. It's usually best to use all the available cores for better responsiveness. This will mean faster CI jobs and quicker results on dev laptops.
A few more forgotten bits:
- https://github.com/abhinavsingh/proxy.py/pull/676
- https://github.com/abhinavsingh/proxy.py/pull/677
Cool, I hope the `pylint` config update will reduce the time for which the CPUs are in a tight spin loop. I don't see a problem in utilizing all cores, but spinning up all available 8 cores for 10 seconds on a developer box is indeed not necessary. At least a choice must exist :). Currently my MacBook fan spins up for quite some time after every `flake8` and `pylint` run.
Actually, per the comment in the config, `0` is `auto`, meaning it'll utilize more cores. I suppose you could run these tools manually and pass a different value to `--jobs=` in the CLI.
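For reference, the knob in question looks roughly like this (a sketch assuming a `.pylintrc`-style config, per pylint's standard options):

```ini
# .pylintrc sketch: jobs = 0 autodetects all CPUs; a fixed value caps parallelism
[MASTER]
jobs = 0
```

A one-off override on the command line would be e.g. `pylint --jobs=2 ...`.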
Import loop checker:
- https://github.com/abhinavsingh/proxy.py/pull/678
Coverage.py config:
- https://github.com/abhinavsingh/proxy.py/pull/679
A stricter dedicated pytest config:
- https://github.com/abhinavsingh/proxy.py/pull/680
Enhancement to what's tested in the CI vs what makes its way to PyPI:
- https://github.com/abhinavsingh/proxy.py/pull/682
Pytest config hotfix:
- https://github.com/abhinavsingh/proxy.py/pull/683
I noticed that setuptools emits a barely noticeable warning about multiline package descriptions, so here's the fix:
- https://github.com/abhinavsingh/proxy.py/pull/684
@abhinavsingh how do you feel about replacing the hardcoded package version with one inherited from Git? Including versioning for the intermediate commits. I normally use setuptools-scm for this. I thought I'd set up GHA-based publishing and it'd be quite useful to have the version non-hardcoded.
> @abhinavsingh how do you feel about replacing the hardcoded package version with one inherited from Git? Including versioning for the intermediate commits. I normally use setuptools-scm for this. I thought I'd set up GHA-based publishing and it'd be quite useful to have the version non-hardcoded.
I did envision it (https://github.com/abhinavsingh/proxy.py/issues/102) but never got to it. It will be a bomb to have this. I'll be happy to have even more regular releases (ones not blocked on me personally). E.g. the last version was released about 6 months back :(
@abhinavsingh alright. What's your release flow? Right now, having https://github.com/abhinavsingh/proxy.py/pull/682 merged, you can download artifacts (wrapped as one archive by GHA) from the workflow run, unpack them, and publish with `twine upload`. Do you normally have preparation activities besides re-hardcoding the version, like composing a changelog? If not, this would be quite easy to implement with two PRs.

I must warn that for Git-based versioning to work properly, in cases when the tag is first on another branch, that branch must be merged with a proper merge commit, not a squash, not a rebase (otherwise, the tag won't get into the other branch, be it develop/master/main/whatever, and `git describe` won't be able to discover the proper tag).
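To illustrate the warning above, here's a small self-contained sketch (throwaway repo, made-up `v1.0.0` tag) showing that a proper merge commit keeps the release tag discoverable via `git describe` from the target branch, because the tagged commit stays in its ancestry:

```python
# Demonstrate tag discoverability after a real merge commit, in a temp repo.
import subprocess
import tempfile


def git(*args, cwd):
    """Run a git command with a throwaway identity; return trimmed stdout."""
    return subprocess.run(
        ["git", "-c", "user.email=ci@example.com", "-c", "user.name=ci",
         "-c", "commit.gpgsign=false", *args],
        cwd=cwd, check=True, capture_output=True, text=True,
    ).stdout.strip()


repo = tempfile.mkdtemp()
git("init", "-q", cwd=repo)
git("commit", "-q", "--allow-empty", "-m", "init", cwd=repo)
release_branch = git("rev-parse", "--abbrev-ref", "HEAD", cwd=repo)  # main/master
git("checkout", "-q", "-b", "develop", cwd=repo)
git("commit", "-q", "--allow-empty", "-m", "release prep", cwd=repo)
git("tag", "v1.0.0", cwd=repo)  # tag lands on develop first
git("checkout", "-q", release_branch, cwd=repo)
# A proper merge commit (not squash/rebase) makes the tagged commit an
# ancestor of the release branch, so git-describe can still find the tag:
git("merge", "-q", "--no-ff", "-m", "merge develop", "develop", cwd=repo)
print(git("describe", "--tags", cwd=repo))  # e.g. v1.0.0-1-g<sha>
```

A squash or rebase would instead create a brand-new commit with no tagged ancestor, and `git describe` on the release branch would fail to find `v1.0.0`.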
One preparation activity is related to brew, where the stable version is hardcoded within the `.rb` file. This happens because the advertised brew formulae for both the `stable` and `development` versions point to the `develop` branch. One option could be to have a single brew formula per branch: the one in the master branch points to the stable version, and the formula in the develop branch always points to the development version.

PS: Last I checked, the brew formulae were not even working correctly; something broke in the last 6 months. I think brew now expects a `cask` formula and not a package formula class. Of course, this is only intended for macOS users, and I am not even sure how many folks are using proxy.py via brew. Outside of brew, most things are kept in sync during development, i.e. the `develop` branch is always ready for publishing, a.k.a. merging into the `master` branch.
Typically, if all looks good locally, I follow these steps:

1. Make a PR from `develop` -> `master`
2. Use `vX.Y.Z` as the PR description (we must be using tags, of course)
3. Ensure the PR is merge-ready (GHA workflows are passing)
4. Then, locally on my laptop:
   1. `make lib-release-test`
   2. Install from Test PyPI and do a sanity check
   3. `make lib-release`
5. Merge the PR

In unfortunate circumstances, `Step-4.2` may fail. A patch PR is sent into the develop branch to address the issue, and the process is repeated from `Step-3`.
Currently, no separate `CHANGELOG` file is shipped; in fact, the `README` contains a minimal major-version changelog in the footer. The `vX.Y.Z` PR description works as the `CHANGELOG`, which is updated manually. These PRs are easily discoverable, though currently not linked anywhere (e.g. tags/releases). The `README` footer changelog is updated to advertise major-version change reasons (e.g. architecture change, threaded, threadless, etc).
For the proxy.py scenario, the release workflow can be triggered for PRs from `develop` -> `master`. The workflow can follow `Step-4` and re-use the PR title/description for the tag name/description. Wdyt?
> Actually, per the comment in the config, `0` is `auto`, meaning it'll utilize more cores. I suppose you could run these tools manually and pass a different value to `--jobs=` in the CLI.

I missed taking the screenshot at the right time; otherwise `User: 99%` was what I saw. This is my MacBook's CPU usage over the last 2 minutes. I ran `make lib-lint` 3 times within this period, indicated by the peaks.
IMHO, we'll need to address this at some point. I understand utilizing all cores, but I'd call this PyLint behavior "spinning all cores". The problem is, it doesn't even finish fast; it takes a minimum of ~10 seconds.
Also, have you seen how weirdly PyLint behaves if a mypy check has failed previously? PyLint kind of just hangs for 10-20 seconds before resuming. From the process manager, I see PyLint go on to spawn 100s of processes. Ideally, PyLint should use a strategy similar to proxy.py: maintain one loop per CPU and do things asynchronously within it, instead of spawning 100s of processes, all consuming 30-70% of a CPU.
Another side-effect of PyLint is that we are exhausting GHA quotas more quickly. And I am not surprised, looking at the way PyLint behaves. My laptop literally becomes unresponsive. I fear running that PyLint workflow locally, haha.
I looked on GitHub and found several pull requests where repos replaced PyLint with flake8 for exactly this reason (high CPU usage).
> Typically, if all looks good locally, I follow these steps:
>
> 1. Make a PR from `develop` -> `master`
> 2. Use `vX.Y.Z` as the PR description (we must be using tags, of course)
> 3. Ensure the PR is merge-ready (GHA workflows are passing)
> 4. Then, locally on my laptop:
>    1. `make lib-release-test`
>    2. Install from Test PyPI and do a sanity check
>    3. `make lib-release`
> 5. Merge the PR
Thinking more on it:

- The sanity check in `Step-4.2` is simply re-running the `e2e` workflows with the Test PyPI distribution.
- (Optionally) I'll also look into how to put an `Approve` button on the PR before executing `Step-4.3`, i.e. the release to PyPI. This can be used as an opportunity to do a manual sanity check, at least until this specific workflow matures and we gain trust in it. Later on we can short-circuit the approval step. Just throwing out ideas here; I am unsure how easy it is to add an "Approval" button on a PR via workflows. I see GitHub prompting me to "Approve and Run" workflows for first-time contributors.
- Once in place, we can schedule a daily/weekly release via workflows.
About version names, I am happy to even get rid of SemVer in favor of a simple datetime-based versioning system. SemVer, while great for conveying major-minor-patch versions, is a PITA to maintain, and the onus lies on the authors/developers/maintainers to keep it aligned. At least datetime-based versioning would automate everything related to versioning & tags. Though I am unsure how popular datetime-based versioning is in the community, or what the side effects of migrating to it would be.
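As an illustration of the idea (purely a sketch of one possible CalVer-style scheme, not anything proxy.py uses today), a datetime-based version can be derived mechanically, removing the human judgment SemVer requires:

```python
# Sketch: derive a YYYY.M.D[.serial] version string from the (UTC) date.
# The format and the `serial` tiebreaker for same-day builds are assumptions.
from datetime import datetime, timezone
from typing import Optional


def calver(serial: int = 0, now: Optional[datetime] = None) -> str:
    """Return a calendar-based version, e.g. "2021.11.9" or "2021.11.9.2"."""
    now = now or datetime.now(timezone.utc)
    base = f"{now.year}.{now.month}.{now.day}"
    return f"{base}.{serial}" if serial else base


print(calver(now=datetime(2021, 11, 9, tzinfo=timezone.utc)))  # 2021.11.9
print(calver(serial=2, now=datetime(2021, 11, 9, tzinfo=timezone.utc)))  # 2021.11.9.2
```

Projects like pip use a similar `YEAR.MINOR.PATCH` calendar scheme, so it is not unheard of in the Python ecosystem.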
I think `codecov.yml` is no longer being respected:
https://github.com/abhinavsingh/proxy.py/blob/develop/codecov.yml

We have configured codecov to allow a `1%` diff in coverage without breaking the CI, but it looks like codecov is now unhappy even with a `0.06%` decrease in coverage.

IIRC the presence of the file is enough for codecov to pick it up. Or has the codecov v2 action changed something? Will need to dig into it to confirm anything. Unsure why it has broken.
Re:codecov — AFAIK the action only uploads the reports but their backend reads the config separately from that via the GitHub App you have installed. I don't see the connection to my PR. Besides, that change was necessary since they are sunsetting their previous uploader soon and the migration is inevitable.
re:pylint — it is totally expected that it consumes more resources. By nature, it also has checks evaluating the code by dynamically importing it. This is why it's capable of revealing deeper problems.
As for burning through the GHA minutes, I doubt it's a problem. I use GHA in many places with a lot more jobs running and haven't been able to exceed the free limit. Besides, it's not a bottleneck of the GHA setup anyway: macOS and Windows workers consume the minutes with multipliers of 2x and 10x respectively. They have a far greater influence on the resource quota, especially taking into account that there's a matrix of them vs one small linting job.
FWIW I've also prepared a PR logging the execution time of each check:
- https://github.com/abhinavsingh/proxy.py/pull/692
I can also offer to create a fast linting tox env that would skip pylint, if that helps. You'd get those results from the CIs anyway. Plus I haven't enabled the caching of pre-commit's envs yet, which would probably cut some seconds from the linting run.
re:mypy — the pre-commit tool runs the checks in subprocesses, one after another. I don't see how a mypy run would influence pylint. But since pylint doesn't stream any logs until it finishes its execution, I would imagine that it may subjectively feel like it's taking longer. I think that the PR above will make the actual execution time clearer. Maybe mypy itself is slower when it sees errors, I don't know; we'll be able to see once the PR is merged. If it indeed influences the other process run, the only thing I'd suspect is macOS's process prioritization, which I don't know much about; I believe it gives more priority to the graphical subsystem and could throttle CPU-intensive processes. Since both of these are subprocesses of pre-commit, maybe it sees that one subprocess consumed a lot of resources and, when the next one spawns, decides to limit its CPU time. But it's all speculation at this point, and I don't have the tools to verify whether that's the case.
P.S. You can pass args to tox commands if they have `{posargs}`. With lint, you could do `tox -e lint -- mypy --all-files` to just run this one check and then compare the time it takes to complete (otherwise, use `pre-commit run mypy --all-files` to run it without a wrapper).
Amazingly, `tox -e lint -- mypy --all-files` finished immediately. That's surprising; I have never seen it finish so fast when running the entire suite.
```
$ time tox -e lint -- mypy --all-files
lint installed: astroid==2.8.4,attrs==21.2.0,backports.entry-points-selectable==1.1.0,bcrypt==3.2.0,cffi==1.15.0,cfgv==3.3.1,cryptography==35.0.0,distlib==0.3.3,filelock==3.3.2,identify==2.3.3,iniconfig==1.1.1,isort==5.10.0,lazy-object-proxy==1.6.0,mccabe==0.6.1,nodeenv==1.6.0,packaging==21.2,paramiko==2.8.0,platformdirs==2.4.0,pluggy==1.0.0,pre-commit==2.15.0,py==1.11.0,pycparser==2.20,pylint==2.11.1,pylint-pytest==1.0.3,PyNaCl==1.4.0,pyparsing==2.4.7,pytest==6.2.5,PyYAML==6.0,six==1.16.0,toml==0.10.2,types-cryptography==3.3.8,types-enum34==1.1.1,types-ipaddress==1.0.1,types-paramiko==2.7.3,virtualenv==20.10.0,wrapt==1.13.3
lint run-test-pre: PYTHONHASHSEED='905921385'
lint run-test: commands[0] | /Users/abhinavsingh/Dev/proxy.py/.tox/lint/bin/python -m pre_commit run --show-diff-on-failure --hook-stage manual mypy --all-files
mypy.....................................................................Passed
lint run-test: commands[1] | -/Users/abhinavsingh/Dev/proxy.py/.tox/lint/bin/python -c 'cmd = "/Users/abhinavsingh/Dev/proxy.py/.tox/lint/bin/python -m pre_commit install"; scr_width = len(cmd) + 10; sep = "=" * scr_width; cmd_str = " $ {cmd}";' 'print(f"\n{sep}\nTo install pre-commit hooks into the Git repo, run:\n\n{cmd_str}\n\n{sep}\n")'
______________________________________________ summary _______________________________________________
lint: commands succeeded
congratulations :)
tox -e lint -- mypy --all-files 0.97s user 0.25s system 98% cpu 1.242 total
```
`flake8` timing is similar when run in isolation:
```
$ time tox -e lint -- flake8 --all-files
lint installed: astroid==2.8.4,attrs==21.2.0,backports.entry-points-selectable==1.1.0,bcrypt==3.2.0,cffi==1.15.0,cfgv==3.3.1,cryptography==35.0.0,distlib==0.3.3,filelock==3.3.2,identify==2.3.3,iniconfig==1.1.1,isort==5.10.0,lazy-object-proxy==1.6.0,mccabe==0.6.1,nodeenv==1.6.0,packaging==21.2,paramiko==2.8.0,platformdirs==2.4.0,pluggy==1.0.0,pre-commit==2.15.0,py==1.11.0,pycparser==2.20,pylint==2.11.1,pylint-pytest==1.0.3,PyNaCl==1.4.0,pyparsing==2.4.7,pytest==6.2.5,PyYAML==6.0,six==1.16.0,toml==0.10.2,types-cryptography==3.3.8,types-enum34==1.1.1,types-ipaddress==1.0.1,types-paramiko==2.7.3,virtualenv==20.10.0,wrapt==1.13.3
lint run-test-pre: PYTHONHASHSEED='306389043'
lint run-test: commands[0] | /Users/abhinavsingh/Dev/proxy.py/.tox/lint/bin/python -m pre_commit run --show-diff-on-failure --hook-stage manual flake8 --all-files
flake8...................................................................Passed
lint run-test: commands[1] | -/Users/abhinavsingh/Dev/proxy.py/.tox/lint/bin/python -c 'cmd = "/Users/abhinavsingh/Dev/proxy.py/.tox/lint/bin/python -m pre_commit install"; scr_width = len(cmd) + 10; sep = "=" * scr_width; cmd_str = " $ {cmd}";' 'print(f"\n{sep}\nTo install pre-commit hooks into the Git repo, run:\n\n{cmd_str}\n\n{sep}\n")'
________________________________________________________ summary _________________________________________________________
lint: commands succeeded
congratulations :)
tox -e lint -- flake8 --all-files 18.52s user 0.45s system 99% cpu 19.022 total
```
`pylint` is the worst of all, even when run in isolation. Look at the `1445%` CPU; the total execution takes ~54 seconds (748.66s of user time):
```
$ time tox -e lint -- pylint --all-files
lint installed: astroid==2.8.4,attrs==21.2.0,backports.entry-points-selectable==1.1.0,bcrypt==3.2.0,cffi==1.15.0,cfgv==3.3.1,cryptography==35.0.0,distlib==0.3.3,filelock==3.3.2,identify==2.3.3,iniconfig==1.1.1,isort==5.10.0,lazy-object-proxy==1.6.0,mccabe==0.6.1,nodeenv==1.6.0,packaging==21.2,paramiko==2.8.0,platformdirs==2.4.0,pluggy==1.0.0,pre-commit==2.15.0,py==1.11.0,pycparser==2.20,pylint==2.11.1,pylint-pytest==1.0.3,PyNaCl==1.4.0,pyparsing==2.4.7,pytest==6.2.5,PyYAML==6.0,six==1.16.0,toml==0.10.2,types-cryptography==3.3.8,types-enum34==1.1.1,types-ipaddress==1.0.1,types-paramiko==2.7.3,virtualenv==20.10.0,wrapt==1.13.3
lint run-test-pre: PYTHONHASHSEED='3638090282'
lint run-test: commands[0] | /Users/abhinavsingh/Dev/proxy.py/.tox/lint/bin/python -m pre_commit run --show-diff-on-failure --hook-stage manual pylint --all-files
PyLint...................................................................Passed
lint run-test: commands[1] | -/Users/abhinavsingh/Dev/proxy.py/.tox/lint/bin/python -c 'cmd = "/Users/abhinavsingh/Dev/proxy.py/.tox/lint/bin/python -m pre_commit install"; scr_width = len(cmd) + 10; sep = "=" * scr_width; cmd_str = " $ {cmd}";' 'print(f"\n{sep}\nTo install pre-commit hooks into the Git repo, run:\n\n{cmd_str}\n\n{sep}\n")'
________________________________________________________ summary _________________________________________________________
lint: commands succeeded
congratulations :)
tox -e lint -- pylint --all-files 748.66s user 32.19s system 1445% cpu 54.020 total
```
Lately I am getting a lot of these coverage errors locally. Behavior is also kind of weird.
- When I hit the "green icon" to execute test, it runs fine.
- Next time when I hit it again, below exception is produced.
- Try again, works
- Try again, doesn't work
- Try again, works
- Doesn't work
.... this behavior is consistent

Made some changes to the workflow files. We might have to pipeline them into the `check` step for a green CI.

- https://github.com/abhinavsingh/proxy.py/pull/699
> Made some changes to the workflow files. We might have to pipeline them into the `check` step for a green CI.
At some point we may even have to abstract out the linter steps, so that they can run when a matching file changes. After #699, no workflow will run for a `README.md`-only change, but we might still want to run the `markdown` linter.
> At some point we may even have to abstract out linter steps, so that they can run when a matching file changes.
It should be possible to achieve with job-level `if`-clauses.
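For illustration, one hedged way to wire this up (the `dorny/paths-filter` action, the filter names, and the lint command here are assumptions; a dedicated workflow with an `on.push.paths` trigger could achieve a similar effect):

```yaml
# Sketch: run the markdown lint job only when Markdown files change
jobs:
  changes:
    runs-on: ubuntu-latest
    outputs:
      md: ${{ steps.filter.outputs.md }}
    steps:
    - uses: actions/checkout@v2
    - id: filter
      uses: dorny/paths-filter@v2
      with:
        filters: |
          md:
          - '**/*.md'
  lint-md:
    needs: changes
    if: needs.changes.outputs.md == 'true'
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - run: echo "run the markdown linter here"  # placeholder command
```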
> Lately I am getting a lot of these coverage errors locally. Behavior is also kind of weird.
This may be happening if you run some tests in parallel and they fight for access to the DB. I don't use that editor so I don't really know what you execute. Maybe try configuring it to run tox properly?
> It should be possible to achieve with job-level `if`-clauses.
But won't we have to carry that clause forward into the tox CLI flags? Because IIUC, `yamllint`, `trailingcomma`, etc. are part of the workflow that also executes `flake8`, `mypy`, etc. Please correct me if wrong.
> But won't we have to carry forward that clause into tox cli flags?
I usually have a separate job to set multiple flags to prevent copy-paste and improve consistency across the workflow. Besides, I guess it could be separated if really needed. As for the flags, probably no, maybe an env var tho. It depends on the setup. This still feels like premature optimization to me. There are more minute-consuming parts of the matrix atm.
re:releases — now that the testing of the published thing is satisfied, I'll send a PR integrating setuptools-scm.
The following PR will add a `workflow_dispatch` trigger allowing you to type in the next desired version in the GH UI; when triggered, it'll run the tests, publish to TestPyPI and normal PyPI (after an additional approval), and then push a tag back to the repository. You'll run it off develop, which will then get tagged, and post-release you can merge develop (with the tag on it) into master (remember to use a merge commit).
I'm going to skip other release activities for now but if you want, I could add updating the brew formulae version in an additional commit.
This should align with a lot of other release workflows in the ecosystem.
Sounds good?
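For context, the trigger being described might look roughly like this (a sketch; the input name and description are assumptions, not necessarily what the upcoming PR will use):

```yaml
# Hypothetical workflow trigger: a manually-dispatched release with a typed-in version
on:
  workflow_dispatch:
    inputs:
      release-version:
        description: Version to release (e.g. 2.4.0)
        required: true
```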
> I'll send a PR integrating setuptools-scm.
Awesome :) +1
> The following PR will add a `workflow_dispatch` trigger allowing to type in the next desired version on the GH UI
Even better :D
> into master (remember to use a merge commit)
I am wondering if this is still necessary? My ideal preference is to have a single release PR merging into master branch. But if there are certain restrictions imposed by it on the workflows, I am happy to follow your recommendations.
> I'm going to skip other release activities for now but if you want, I could add updating the brew formulae version in an additional commit.
Sounds good. At some point we can follow up with a Docker release. I think Docker still has traction compared to brew.
> This should align with a lot of other release workflows in the ecosystem. Sounds good?
Sounds perfect to me :D :D :D +1
> I am wondering if this is still necessary? My ideal preference is to have a single release PR merging into master branch. But if there are certain restrictions imposed by it on the workflows, I am happy to follow your recommendations.
Honestly, I don't fully understand the purpose of `master`. Does it just track the latest stable release?

As for the merge, it could be a fast-forward as well. The thing is that SCM versioning follows the current commit's parents until it finds one that is tagged. If you cherry-pick that commit (which is basically what happens when you squash or rebase), the tagged commit is no longer an ancestor, so the tag can't be discovered.
Yes, I recently added some docs about it, see https://github.com/abhinavsingh/proxy.py#stable-vs-develop. I am following a pattern which will be more fruitful in large projects with several contributors. If and when that happens, here is how this structure will be useful:

- Quick `vX.Y.Z-dev` releases from the `develop` branch, e.g. every day, or even on every PR submission
- Then, bi-weekly or monthly or based upon a release schedule, choose the best `-dev` tag, merge it into `master`, and make the next stable release
I think in projects like proxy.py, where end users are being served directly from the library, more validation is needed before promoting a release into production environments. In the worst-case scenario:

- End users can experience total browsing loss due to a bug in proxy.py
- There might be a security bug (several security researchers have reached out in the past with vulnerabilities they found)

I use it constantly on my local laptop (develop). We deploy it in our office network (stable or develop). We offer services to end users (always stable + patches). Ideally, I want to give a dev release a spin for a week before promoting it to production.
What do you think about it?
I just noticed that there are no Git tags in the repo. How do you find the versions corresponding to the published dists? This is quite surprising. Could you at least push the latest tag for the last release retrospectively?
Here's the versioning bit:
- https://github.com/abhinavsingh/proxy.py/pull/715
It is based on tags so you should tag the previous version for this to properly calculate the value.
And the GHA bit:
- https://github.com/abhinavsingh/proxy.py/pull/716
This one will need to be tested after the merge because most of the additions don't run in pull requests.
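For reference, the setuptools-scm wiring usually amounts to a small config like this (a sketch based on the tool's documented options; the `write_to` path is a guess matching the `proxy.common._scm_version` module name that appears in the codebase):

```toml
# pyproject.toml sketch: derive the package version from Git tags
[build-system]
requires = ["setuptools >= 45", "setuptools-scm[toml] >= 6.2", "wheel"]
build-backend = "setuptools.build_meta"

[tool.setuptools_scm]
write_to = "proxy/common/_scm_version.py"
```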
> I just noticed that there are no Git tags in the repo. How do you find the versions corresponding to the published dists? This is quite surprising. Could you at least push the latest tag for the last release retrospectively?
Sure, I can start adding tags. We have a release PR for every release, titled `vX.Y.Z`, so that's how we go back if necessary. And in general, the `master` branch is always the last release.
The problem is that GitHub is not Git. The automated tooling does not attempt to query GitHub APIs and does not have access to things that are not in Git.
> Here's the versioning bit:
>
> It is based on tags so you should tag the previous version for this to properly calculate the value.
>
> And the GHA bit:
>
> This one will need to be tested after the merge because most of the additions don't run in pull requests.
Please give me a couple of days; I'll jump on these PRs soon. I also want to take this opportunity to take a tour of our current workflows. Quite a bit has changed, and I haven't dug into them much yet.
I usually start proxy locally using this command, and it looks like it is broken currently. Unsure how the tests were working; maybe because they had a generated `_scm_version` file, but that won't be available locally.
```
$ python -m proxy
Traceback (most recent call last):
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/common/_version.py", line 5, in <module>
    from ._scm_version import version as __version__  # noqa: WPS433, WPS436
ModuleNotFoundError: No module named 'proxy.common._scm_version'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/abhinavsingh/.pyenv/versions/3.10.0/lib/python3.10/runpy.py", line 187, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/Users/abhinavsingh/.pyenv/versions/3.10.0/lib/python3.10/runpy.py", line 146, in _get_module_details
    return _get_module_details(pkg_main_name, error)
  File "/Users/abhinavsingh/.pyenv/versions/3.10.0/lib/python3.10/runpy.py", line 110, in _get_module_details
    __import__(pkg_name)
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/__init__.py", line 11, in <module>
    from .proxy import entry_point, main, Proxy
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/proxy.py", line 18, in <module>
    from .core.acceptor import AcceptorPool, ThreadlessPool, Listener
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/core/acceptor/__init__.py", line 11, in <module>
    from .acceptor import Acceptor
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/core/acceptor/acceptor.py", line 23, in <module>
    from proxy.core.acceptor.executors import ThreadlessPool
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/core/acceptor/executors.py", line 22, in <module>
    from .work import Work
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/core/acceptor/work.py", line 18, in <module>
    from ..event import eventNames, EventQueue
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/core/event/__init__.py", line 15, in <module>
    from .manager import EventManager
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/core/event/manager.py", line 20, in <module>
    from ...common.flag import flags
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/common/flag.py", line 22, in <module>
    from .plugins import Plugins
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/common/plugins.py", line 20, in <module>
    from .utils import bytes_, text_
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/common/utils.py", line 23, in <module>
    from .constants import HTTP_1_1, COLON, WHITESPACE, CRLF, DEFAULT_TIMEOUT, DEFAULT_THREADLESS
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/common/constants.py", line 21, in <module>
    from .version import __version__
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/common/version.py", line 13, in <module>
    from ._version import __version__  # noqa: WPS436
  File "/Users/abhinavsingh/Dev/proxy.py/proxy/common/_version.py", line 8, in <module>
    __version__ = _get_dist('proxy.py').version  # noqa: WPS440
  File "/Users/abhinavsingh/Dev/proxy.py/venv310/lib/python3.10/site-packages/pkg_resources/__init__.py", line 466, in get_distribution
    dist = get_provider(dist)
  File "/Users/abhinavsingh/Dev/proxy.py/venv310/lib/python3.10/site-packages/pkg_resources/__init__.py", line 342, in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
  File "/Users/abhinavsingh/Dev/proxy.py/venv310/lib/python3.10/site-packages/pkg_resources/__init__.py", line 886, in require
    needed = self.resolve(parse_requirements(requirements))
```
File "/Users/abhinavsingh/Dev/proxy.py/venv310/lib/python3.10/site-packages/pkg_resources/__init__.py", line 772, in resolve
raise DistributionNotFound(req, requirers)
pkg_resources.DistributionNotFound: The 'proxy.py' distribution was not found and is required by the application
I usually start proxy locally using this command and it looks like it is broken currently. Unsure how the tests were working. Maybe because they had a generated `_scm_version` file. But it won't be available locally.
A few options probably will be:

- Write `_scm_version` when `proxy.py` starts up. Because the `_scm_version` file will only be valid in a `git` environment, this should be safe to do.
- For environments where `proxy.py` is installed via `pypi`, the version will be extracted from the distribution.

For step 1, we are talking about a write to a file from within `_version.py`. Alternatively, we can also add a make target which will write the `_scm_version` file, then advertise the same via the `README`, because there will be folks trying to use it from the source code. In the current form, their first invocation will simply fail.
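The distribution-extraction path can be sketched with the stdlib `importlib.metadata`, a modern replacement for the `pkg_resources` call seen in the traceback above. The helper name and the fallback value are illustrative assumptions:

```python
from importlib.metadata import PackageNotFoundError, version

def detect_version(dist_name: str = "proxy.py") -> str:
    """Prefer the installed distribution's metadata; degrade gracefully."""
    try:
        # Works after `pip install proxy.py` or `pip install -e .`
        return version(dist_name)
    except PackageNotFoundError:
        # Bare source checkout without an install: no metadata available.
        return "0.0.0.dev0+unknown"
```

Degrading to a placeholder instead of raising would at least keep `python -m proxy` usable from a bare checkout.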
You mentioned `sphinx` at some point and today I am thinking of having a Wiki/Doc site (probably via GitHub Pages). One key reason is the growing `README`. Among other things, new flags are being added, which grows the README even further. And because it is a README, not everything can be written in detail either.
I am wondering what the recommended setup/workflow for Sphinx is. Generating a template is straightforward. But should document authors also build locally and commit the result in the PR? Or should a workflow trigger the Sphinx build and push the result to a branch (maybe `gh-pages`)? Open to suggestions.
I think it's error-prone to allow running it from the source. Plus, I'd rather not make `setuptools-scm` and Git runtime dependencies.
The reason is that it still requires the dependencies to be installed, meaning that you'd need to do `pip install -e .` to get the runtime deps that would generate that file anyway. I guess you happen to have that one dependency by accident, but the normal contributor workflow is to install the deps after cloning the repo.
Besides, when it's installed in editable mode, the correct metadata with the static version will already be present in `site-packages`, which is where `proxy.py` will get the version info.
P.S. PyPA recommends having an `src`-layout for a reason. This is the simplest way to prevent accidental testing of a Git checkout instead of testing what the actual users get installed. Having a flat layout may be useful for non-distributable projects, like web apps, but for libs/apps like `proxy.py`, I'd discourage this, specifically because of such side effects — it's possible to start depending on dev env side effects without realizing it, which would only get revealed by the end-users too late.
You mentioned `sphinx` at some point and today I am thinking of having a Wiki/Doc site (probably via GitHub Pages). One key reason is the growing `README`. Among other things, new flags are being added, which grows the README even further. And because it is a README, not everything can be written in detail either. I am wondering what the recommended setup/workflow for Sphinx is. Generating a template is straightforward. But should document authors also build locally and commit the result in the PR? Or should a workflow trigger the Sphinx build and push the result to a branch (maybe `gh-pages`)? Open to suggestions.
The most popular way is to use RTD. Just go to https://readthedocs.org, log in via GitHub and create a project for this repo. After the connection is made, it will automatically re-build the website on pushes to `develop` and other branches/tags as you set it up. It will also build a copy of the docs site for each PR, allowing previews of the changes.
Traditionally, only the source code is kept in Git. Plus a tox integration for local and CI builds. No need for anything else.
The quickstart generator is a bit rusty and generates too much garbage, so I can send a PR with a clean initial skeleton instead. Sphinx is RST-centric out of the box. If you prefer to use Markdown too, I can add a MyST integration, which emerged this year and allows using RST/Sphinx roles and directives via an extended syntax. Also, the currently popular theme is Furo; it should be integrated too. Plus, the level of strictness should be tweaked.
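For reference, the kind of `conf.py` fragment such a skeleton might contain. `myst_parser` and `furo` are the real extension/theme names; the rest is an illustrative assumption, not proxy.py's actual config:

```python
# docs/conf.py (sketch)
extensions = [
    "myst_parser",  # lets .md sources use Sphinx roles/directives
]
html_theme = "furo"
source_suffix = {
    ".rst": "restructuredtext",
    ".md": "markdown",
}
```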
I guess you happen to have that one dependency by accident but the normal contributor workflow is to install the deps after cloning the repo
Correct, I had `setuptools-scm` already installed. Locally, for now, I created the file under `common/_scm_version.py`.
that you'd need to do `pip install -e .` to get the runtime deps that would generate that file anyway
I am not sure about this command. I tried the following steps (from the current README):

- git clone ...
- cd proxy.py && python3 -m venv venv && source venv/bin/activate
- make lib-deps # Runs "pip install -r"
- python -m proxy -h
- Fails with the exception above
I also tried `pip install -e .`
IMHO, a user who has checked out the source code (intended contributor or not) should not have to install the package from source. Installation is not the intention and should not be a mandatory step. They could have just installed it from `pypi`.
To replace `make lib-deps`, I'll be happy to have `pip install <some flag>` which can install the deps. But where it will keep the generated SCM version for the local dev version is still a question. IMHO, the correct solution is to just add a make target which will generate the file. Then we can advertise this step in the README as a mandatory step (after `make lib-deps`) to work with the source code.
It will also build a copy of the docs site for each PR allowing previews of the changes.
This will be awesome
I can send a PR with a clean initial skeleton
🙏🙏🙏
If you prefer to use Markdown too, I can add a MyST integration
This will be cool. Markdown is indeed easy to work with (for me at least :))
Also, the theme that is currently popular is Furo, it should be integrated too
I just checked it out, looks sleek.
Plus, the level of strictness should be tweaked.
Can you expand? Strictness of what?
Thank you
I guess you happen to have that one dependency by accident but the normal contributor workflow is to install the deps after cloning the repo
Correct, I had `setuptools-scm` already installed. Locally, for now, I created the file under `common/_scm_version.py`.
I was talking about the typing extensions one. `setuptools-scm` is a build-time dependency, not a runtime one (and it shouldn't be a runtime one).
that you'd need to do `pip install -e .` to get the runtime deps that would generate that file anyway
I am not sure about this command. I tried the following steps (from the current README):

- git clone ...
- cd proxy.py && python3 -m venv venv && source venv/bin/activate
- make lib-deps # Runs "pip install -r"
- python -m proxy -h
- Fails with the exception above
This is because it's not a runtime dependency, it's a build-time one. And I didn't make use of Git and `setuptools-scm` at runtime. The version is determined during the build, put in the right location, and is then available at runtime. It is also recorded in the metadata during the install.
I also tried `pip install -e .`
IMHO, a user who has checked out the source code (intended contributor or not) should not have to install the package from source. Installation is not the intention and should not be a mandatory step. They could have just installed it from `pypi`.
If they install from PyPI, it gets the typing extension dependency. Without it, locally you still have to get it, but then you have a completely separate script to do the same thing. Because they are detached but must do the same thing, it puts a maintenance burden on you to keep the behaviors in sync. It's not sustainable.
The commonly accepted practice is to use `--editable` installs: https://packaging.python.org/guides/distributing-packages-using-setuptools/#working-in-development-mode. There's no reason to reinvent the wheel only to maintain your own set of scripts that the whole Python ecosystem is unfamiliar with and that each person would have to learn, instead of relying on something that is common knowledge.
Another point here is that in your snippet, you're mixing up two workflows — for the users and for the contributors. The former do not need to install a bunch of development and release dependencies (and manage them manually after that) — they just need that single runtime dep and that's it.
The latter, OTOH, need the dev/test deps (needing the release ones is questionable).
It's actually a bit interesting for the release deps: they are only needed on release, and none of those user/contributor categories actually require them. `twine` could be kept in the release tox env only; plus it'll be completely separate in GHA, so technically you'll probably never need to run it locally ever again. And `setuptools-scm` is a build-time dependency for the package itself, but I've also added it to the release deps, only to accommodate the container generation not being fully integrated into the workflow (once it is, it should likely be dropped from there).
While writing the above, I realised that you've made multiple separate use-cases too interconnected within one env and they could benefit from separate ones. A good thing is that tox testing doesn't pollute your current venv but manages its own envs separately.
To replace `make lib-deps`, I'll be happy to have `pip install <some flag>` which can install the deps. But where it will keep the generated SCM version for the local dev version is still a question. IMHO, the correct solution is to just add a make target which will generate the file. Then we can advertise this step in the README as a mandatory step (after `make lib-deps`) to work with the source code.
If you want, you could add `pip install -e .` to that target. Some projects add `-e .` to a requirements file that includes others via `-r`. I'd argue that that's for apps, not distributables, though. Plus, with what I wrote above, you may want to rethink how you manage the deps. Most Python projects don't use a Makefile but rely solely on tox or nox. And that is their source of truth for running tests and other automation, plus it maps to CIs and other systems seamlessly.
It will also build a copy of the docs site for each PR allowing previews of the changes.
This will be awesome
I can send a PR with a clean initial skeleton
🙏🙏🙏
You could go to RTD and create a project there already with a few clicks, no need to wait for the PR.
If you prefer to use Markdown too, I can add a MyST integration
This will be cool. Markdown is indeed easy to work with (for me at least :))
It is easy but the common syntax is limited and doesn't support more advanced/flexible features that Sphinx relies on for interlinking. This is why MyST folks came up with some extensions to it making those features available. I still like RST more but since I recognize that for some people it's easier to start with MD, I integrate MyST nowadays, and then it is possible to have a mix of MD and RST documents in the project.
One thing to note is that the docstrings in code traditionally use RST and I haven't checked how well (and if!) it works with MD.
FWIW I recommend getting familiar with RST. Did you know Sphinx was originally invented for upstream docs.python.org? And now it's a de-facto standard within Python projects.
Also, the theme that is currently popular is Furo, it should be integrated too
I just checked it out, looks sleek.
It's by a pip maintainer. Nowadays many new projects start with it and many popular ones have switched. It is the only one I know of that takes accessibility into account and supports aligning with the system dark-mode setting, for example.
Plus, the level of strictness should be tweaked.
Can you expand. Strictness of what?
Sphinx is sometimes too permissive by default and doesn't fail if you have broken references. It's not very pleasant to realize that you've been linking to a non-existing thing for a year, is it? This is why it's best to make it fail loudly, so that when you add new content, you'd know whether it's actually valid.
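A hedged sketch of what "failing loudly" could look like in `conf.py` (illustrative values, not the project's actual settings):

```python
# docs/conf.py (sketch): surface broken cross-references.
nitpicky = True  # warn about every unresolved reference
nitpick_ignore = [
    # Known-unresolvable targets can be allowlisted explicitly.
    ("py:class", "socket.socket"),
]
```

Combined with `sphinx-build -W`, those warnings become hard errors in CI.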
https://packaging.python.org/guides/distributing-packages-using-setuptools/#working-in-development-mode. There's no reason to reinvent the wheel only to maintain own set of scripts that the whole Python ecosystem is unfamiliar with and each person would have to learn to use instead of relying on something that is common knowledge.
I didn't see this section before. It makes more sense now why you are recommending `pip install -e .`
For me personally, I haven't used this workflow before and initially I assumed the package would need to be installed repeatedly. But that's not the case here. Another advantage I see is that `python -m proxy -h` will no longer be needed; with an editable install, `proxy -h` should work just fine.
Another point here is that in your snippet, you're mixing up two workflows — for the users and for the contributors. The former do not need to install a bunch of development and release dependencies (and manage them manually after that) — they just need that single runtime dep and that's it. The latter, OTOH, need the dev/test deps (needing the release ones is questionable). It's actually a bit interesting for the release deps: they are only needed on release, and none of those user/contributor categories actually require them. `twine` could be kept in the release tox env only; plus it'll be completely separate in GHA, so technically you'll probably never need to run it locally ever again. And `setuptools-scm` is a build-time dependency for the package itself, but I've also added it to the release deps, only to accommodate the container generation not being fully integrated into the workflow (once it is, it should likely be dropped from there). While writing the above, I realised that you've made multiple separate use-cases too interconnected within one env and they could benefit from separate ones. A good thing is that tox testing doesn't pollute your current venv but manages its own envs separately.
You have a point here. I am indeed trying to mix multiple use cases into a single workflow. If `pip install -e .` is the standard, we should follow the same. Let me give it a try locally for a few days and see how the experience is.
If you want, you could add `pip install -e .` to that target
Yep will make this change next after my trial run :)
Most Python projects don't use a Makefile but rely solely on tox or nox.
We could do the same with `proxy.py` and remove the Makefile altogether. For me, the current Makefile is just a convenient shortcut/alias, nothing more than that :) We technically don't have a real make build system in place.
FWIW I recommend getting familiar with RST. Did you know Sphinx was originally invented for upstream docs.python.org? And now it's a de-facto standard within Python projects.
Right, now that you have mentioned it, I remember reading about it. I used Sphinx quite a bit in the past (the ~2012-2016 era), but haven't caught up with it since. `RST` is indeed better for complex scenarios. Though, IIUC, 90% of the doc content will likely never need these complex bits.
Sphinx is too permissive by default sometimes and doesn't fail if you have broken references.
+1. We indeed don't want any broken links. That would be the worst experience for users landing on such docs.
Right, now that you have mentioned it, I remember reading about it. I used Sphinx quite a bit in the past (the ~2012-2016 era), but haven't caught up with it since. `RST` is indeed better for complex scenarios. Though, IIUC, 90% of the doc content will likely never need these complex bits.
I haven't seen docs sites that don't use automatic interlinking features, honestly. The only case is when the users don't realize what's possible. But then, when they learn about those features, they can't stop using them :)
@abhinavsingh meanwhile, I believe you could merge https://github.com/abhinavsingh/proxy.py/pull/716 right away. It doesn't change the CI setup currently present on `develop` and adds some of the CD bits that will only be visible/usable post-merge.
Plus, I've written a few pointers there that can be completed post-merge.
@abhinavsingh I think I've unblocked #716. No need to wait for the TestPyPI account recovery.
Apologies, this was a mistake. I checked the logs for the upload artifacts by mistake and got confused. I have submitted #741, which should now enable test.pypi.org releases.
Observing that `lib` workflows are failing after PR merges with an `Unable to find any artifact` error.

@abhinavsingh for testing the release workflow, I'd recommend requesting the CI/CD to publish alpha versions, like `2.4.0a0`, for example. Just to test the automation, and when the thing gets released on PyPI, the users wouldn't get it automatically by mistake. Once you're sure that the automation is set up well, you'd be able to run stable releases right away.
As for TestPyPI, it is supposed to get releases in two cases: 1) when the release is requested (additionally to the normal PyPI, just not requiring approval) and 2) when something is merged to `develop` (on pushes to that branch) — in this case, it is supposed to upload `dev`-releases.
Let me check what's going on, I haven't looked into this repo's changes today yet.
Note that the dists are stored as artifacts within GHA CI/CD and used during normal testing. They are never published to (Test)PyPI from the PR builds.
Observing that `lib` workflows are failing after PR merges with an `Unable to find any artifact` error.
This is because one of the safeguard checks failed. This is caused by a minor misconfiguration. I'll send a PR to fix it shortly.
@abhinavsingh https://github.com/abhinavsingh/proxy.py/pull/743 should fix it.
Like `2.4.0a0`, for example. Just to test the automation and when the thing gets released on PyPI, the users wouldn't get it automatically by mistake
Good point. I am also checking https://www.python.org/dev/peps/pep-0440/#developmental-releases and https://www.python.org/dev/peps/pep-0440/#pre-releases; we might have to align our version names with them.
I'd recommend requesting the CI/CD to publish alpha-versions.
Sorry, can you expand? Request from whom?
when something is merged to `develop` (on pushes to that branch) — in this case, it is supposed to upload `dev`-releases.
Makes sense
Like `2.4.0a0`, for example. Just to test the automation and when the thing gets released on PyPI, the users wouldn't get it automatically by mistake
Good point. I am also checking python.org/dev/peps/pep-0440/#developmental-releases and python.org/dev/peps/pep-0440/#pre-releases; we might have to align our version names with them.
`a0` / `b0` / `rc0` / `dev0` are all pre-releases in terms of PEP 440 (PyPI labels them as such and `pip install` requires an explicit `--pre` option to "see" them). I'd keep `dev` for automated release health-checks with TestPyPI and use the rest (or just stable versions) for publishing to the normal PyPI.
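A hedged sketch of that classification, using a deliberately simplified pattern (PEP 440's real grammar is more permissive; it also normalizes alternate spellings and allows post-releases, epochs, etc.):

```python
import re

# Simplified: final versions like "2.4.0", optionally followed by a
# pre-release segment (a/b/rc) and/or a dev segment.
VERSION_RE = re.compile(r"^\d+(\.\d+)*((a|b|rc)\d+)?(\.dev\d+)?$")

def is_prerelease(v: str) -> bool:
    m = VERSION_RE.match(v)
    # Any pre-release or dev segment makes pip hide it without --pre.
    return bool(m and (m.group(2) or m.group(4)))

for v in ("2.4.0", "2.4.0a0", "2.4.0rc1", "2.3.2.dev147"):
    print(v, is_prerelease(v))
```

In practice the `packaging` library's `Version.is_prerelease` does this authoritatively; the regex above is only to show which suffixes trigger the behavior.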
I'd recommend requesting the CI/CD to publish alpha-versions.
Sorry, can you expand? Request from whom?
Go to https://github.com/abhinavsingh/proxy.py/actions/workflows/test-library.yml and click on `Run workflow`. You'll then see a form with inputs. When you enter your desired version, you're requesting the CI/CD to perform publishing. This is what I meant earlier.
I noticed that the sdist is substantially bigger than the wheel; you may want to do something about it: https://test.pypi.org/project/proxy.py/2.3.2.dev147/#files.
I haven't checked closely, but it seems like the sdist contains redundant content (like vendored normal and minimized JS/CSS files). We could drop some of them to reduce the size.
The wheel doesn't contain the dashboard front-end. Not sure if that's intended, but it is certainly not tested within the `lib` workflow. This is why it's on my list to merge them.
The wheel doesn't contain the dashboard front-end. Not sure if that's intended, but it is certainly not tested within the `lib` workflow.
Yep, we don't bundle the dashboard. That is intentional, because of size. Not everyone might use/require the dashboard. I was thinking of maybe providing the dashboard as a `pip install proxy.py[dashboard]` functionality. Underneath, it will be a separate package. We can bundle it together with the `proxy.py` package, but there are too many bits to it. The dashboard has its own plugin ecosystem. One of the plugins is `devtools`, which bundles the entire `chrome-devtools-inspector` into the `proxy.py` release. This can significantly bump the pip distribution size, something we don't want to do by default.
Just throwing out some ideas here. Maybe, just maybe, we can split out each plugin into a separate package. This will probably be too much, but it has several advantages. One advantage is that we as authors get insight into which plugins are used most by the community. Eventually, at some point, I also plan to build a community plugin system, where plugins can be discovered from GitHub and installed ad-hoc. E.g. `pip install proxy.py[reverse-proxy,dashboard,modified-chunk,man-in-the-middle]` will install the intended plugins only. We can also use the underlying `setuptools` ecosystem to discover plugins under a top-level namespace (PS: I read about it some time back; my understanding of it is still hazy). However, most repo-provided plugin packages will only contain a single file. Except for a few plugins, e.g. `ProxyPool`, I expect folks to use the provided plugins as a base to build their real production-grade plugins. Some folks have even published their own wrapper library, e.g. https://github.com/Zusyaku/Termux-And-Kali-Linux-V2/tree/main/Proxverter-main#readme. By discovering community plugins, even external plugins will gain visibility within the `proxy.py` ecosystem.
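The setuptools mechanism alluded to is entry points: each plugin distribution declares an entry point under a shared group, and the host enumerates them at runtime. A hedged sketch using the stdlib `importlib.metadata` (the group name `proxy.plugins` is made up for illustration):

```python
from importlib.metadata import entry_points

def discover_plugins(group: str = "proxy.plugins") -> dict:
    """Map plugin names to their entry points for the given group."""
    eps = entry_points()
    if hasattr(eps, "select"):  # Python 3.10+ API
        found = eps.select(group=group)
    else:  # Python 3.8/3.9 return a mapping of group -> list
        found = eps.get(group, [])
    # Each entry point can later be .load()-ed to get the plugin class.
    return {ep.name: ep for ep in found}
```

A plugin package would advertise itself by declaring an entry point in that group in its packaging metadata; no code changes in the host are needed to pick up newly installed plugins.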
But the dashboard is a different scenario. It's not a single-file plugin. We might eventually put the dashboard under its own repo once we move `proxy.py` into an org.
I noticed that the sdist is substantially bigger than the wheel; you may want to do something about it: https://test.pypi.org/project/proxy.py/2.3.2.dev147/#files. I haven't checked closely, but it seems like the sdist contains redundant content (like vendored normal and minimized JS/CSS files). We could drop some of them to reduce the size.
Here is the breakdown. I don't see any static files. There is an `adblock.json`, which is small and serves as an example config for the filter-by-URL-regex plugin. Can we just ship `.pyc` files? Will that be of any help, or even work?
$ ls -l proxy.py-2.3.2.dev147-py3-none-any.whl
-rw-r--r--@ 1 abhinavsingh staff 153014 Nov 16 01:52 proxy.py-2.3.2.dev147-py3-none-any.whl
$ unzip -vl ~/Downloads/proxy.py-2.3.2.dev147-py3-none-any.whl | sort -nr -k 1
408832 138744 66% 107 files
74712 Defl:N 21534 71% 11-15-2021 17:04 1b365633 proxy.py-2.3.2.dev147.dist-info/METADATA
39103 Defl:N 8655 78% 11-15-2021 17:03 ad0fa7d0 proxy/http/proxy/server.py
15228 Defl:N 4141 73% 11-15-2021 17:03 4a48b462 proxy/http/handler.py
14177 Defl:N 3960 72% 11-15-2021 17:03 c4623aaa proxy/http/parser/parser.py
13470 Defl:N 3862 71% 11-15-2021 17:03 5082b7d9 proxy/common/flag.py
12472 Defl:N 3509 72% 11-15-2021 17:03 fbc1c8c5 proxy/http/server/web.py
9642 Defl:N 2556 74% 11-15-2021 17:03 2ca65c48 proxy/common/pki.py
9064 Defl:N 4802 47% 11-15-2021 17:04 5a6848dd proxy.py-2.3.2.dev147.dist-info/RECORD
8512 Defl:N 2769 68% 11-15-2021 17:03 0215ea32 proxy/common/utils.py
8347 Defl:N 2786 67% 11-15-2021 17:03 192a8c73 proxy/plugin/proxy_pool.py
8113 Defl:N 2744 66% 11-15-2021 17:03 069d0546 proxy/core/acceptor/threadless.py
6894 Defl:N 2518 64% 11-15-2021 17:03 fd8a85c1 proxy/proxy.py
6596 Defl:N 2207 67% 11-15-2021 17:03 424ea662 proxy/http/proxy/plugin.py
6264 Defl:N 1813 71% 11-15-2021 17:03 08728c34 proxy/core/event/subscriber.py
6260 Defl:N 2053 67% 11-15-2021 17:03 92cf2623 proxy/core/acceptor/executors.py
5945 Defl:N 1479 75% 11-15-2021 17:03 c092c486 proxy/http/inspector/transformer.py
5859 Defl:N 2099 64% 11-15-2021 17:03 9cae0b33 proxy/core/acceptor/acceptor.py
5385 Defl:N 1854 66% 11-15-2021 17:03 4508a395 proxy/plugin/reverse_proxy.py
5230 Defl:N 1496 71% 11-15-2021 17:03 e2896ea2 proxy/http/websocket/frame.py
4968 Defl:N 1521 69% 11-15-2021 17:03 d80a9c30 proxy/core/base/tcp_server.py
4590 Defl:N 1893 59% 11-15-2021 17:03 13e80c73 proxy/common/constants.py
4345 Defl:N 1619 63% 11-15-2021 17:03 2664beea proxy/http/inspector/devtools.py
4300 Defl:N 1622 62% 11-15-2021 17:03 e92dcc13 proxy/http/server/plugin.py
4280 Defl:N 1399 67% 11-15-2021 17:03 658e8d00 proxy/dashboard/dashboard.py
4110 Defl:N 1399 66% 11-15-2021 17:03 217c664e proxy/core/acceptor/pool.py
4066 Defl:N 1514 63% 11-15-2021 17:03 1e30bbaf proxy/http/plugin.py
3900 Defl:N 1343 66% 11-15-2021 17:03 390d0d53 proxy/core/base/tcp_tunnel.py
3874 Defl:N 1379 64% 11-15-2021 17:03 fff8cf93 proxy/core/event/dispatcher.py
3864 Defl:N 1370 65% 11-15-2021 17:03 3eec98cc proxy/http/websocket/client.py
3756 Defl:N 1454 61% 11-15-2021 17:03 7af70214 proxy/core/connection/pool.py
3587 Defl:N 1352 62% 11-15-2021 17:03 7a483791 proxy/common/plugins.py
3345 Defl:N 1281 62% 11-15-2021 17:03 817a73a3 proxy/http/url.py
3178 Defl:N 1081 66% 11-15-2021 17:03 e84e57da proxy/core/acceptor/listener.py
3085 Defl:N 1189 62% 11-15-2021 17:03 c4257e05 proxy/core/connection/connection.py
3078 Defl:N 1275 59% 11-15-2021 17:03 7c41bd76 proxy/plugin/filter_by_url_regex.py
3028 Defl:N 1040 66% 11-15-2021 17:03 ad239c25 proxy/plugin/shortlink.py
3027 Defl:N 1062 65% 11-15-2021 17:03 a2e843a5 proxy/plugin/mock_rest_api.py
3020 Defl:N 1113 63% 11-15-2021 17:03 884d7090 proxy/http/parser/chunk.py
2978 Defl:N 1166 61% 11-15-2021 17:03 d388d53b proxy/core/acceptor/work.py
2870 Defl:N 1105 62% 11-15-2021 17:03 1d1bd088 proxy/http/server/pac_plugin.py
2794 Defl:N 1268 55% 11-15-2021 17:03 15768f74 proxy/plugin/cloudflare_dns.py
2758 Defl:N 936 66% 11-15-2021 17:03 acbfeff4 proxy/dashboard/inspect_traffic.py
2755 Defl:N 1089 61% 11-15-2021 17:03 5e29a644 proxy/testing/test_case.py
2721 Defl:N 1015 63% 11-15-2021 17:03 4925ca6f proxy/core/event/queue.py
2386 Defl:N 940 61% 11-15-2021 17:03 702d3fe9 proxy/core/ssh/tunnel.py
2321 Defl:N 879 62% 11-15-2021 17:03 a1809494 proxy/core/event/manager.py
1986 Defl:N 796 60% 11-15-2021 17:03 b66a350d proxy/core/connection/server.py
1953 Defl:N 750 62% 11-15-2021 17:03 e210e934 proxy/plugin/web_server_route.py
1875 Defl:N 768 59% 11-15-2021 17:03 bdb6c470 proxy/dashboard/plugin.py
1841 Defl:N 764 59% 11-15-2021 17:03 09cffe4c proxy/plugin/cache/base.py
1795 Defl:N 805 55% 11-15-2021 17:03 829360cf proxy/plugin/cache/store/disk.py
1744 Defl:N 746 57% 11-15-2021 17:03 81e00d08 proxy/core/connection/client.py
1666 Defl:N 795 52% 11-15-2021 17:03 dd11356e proxy/http/parser/protocol.py
1655 Defl:N 808 51% 11-15-2021 17:03 21cb8ede proxy/plugin/modify_chunk_response.py
1635 Defl:N 635 61% 11-15-2021 17:03 8db8b3e5 proxy/plugin/__init__.py
1555 Defl:N 687 56% 11-15-2021 17:03 87535f39 proxy/plugin/modify_post_data.py
1540 Defl:N 645 58% 11-15-2021 17:03 7e424eb1 proxy/http/exception/http_request_rejected.py
1426 Defl:N 676 53% 11-15-2021 17:03 84a3a707 proxy/http/proxy/auth.py
1380 Defl:N 712 48% 11-15-2021 17:03 136b3358 proxy/plugin/filter_by_client_ip.py
1366 Defl:N 741 46% 11-15-2021 17:03 8880180d proxy/plugin/custom_dns_resolver.py
1362 Defl:N 699 49% 11-15-2021 17:03 8830fae5 proxy/plugin/filter_by_upstream.py
1352 Defl:N 629 54% 11-15-2021 17:03 aef2ba2b proxy/common/logger.py
1349 Defl:N 629 53% 11-15-2021 17:03 192f4887 proxy/http/codes.py
1323 Defl:N 647 51% 11-15-2021 17:03 ff78db19 proxy/http/exception/proxy_conn_failed.py
1312 Defl:N 656 50% 11-15-2021 17:03 1b26f5ab proxy/plugin/redirect_to_custom_server.py
1287 Defl:N 629 51% 11-15-2021 17:03 b152e803 proxy/http/exception/proxy_auth_failed.py
1147 Defl:N 651 43% 11-15-2021 17:03 65aa936c proxy/common/types.py
1017 Defl:N 540 47% 11-15-2021 17:03 3ee72a5e proxy/core/event/names.py
941 Defl:N 468 50% 11-15-2021 17:03 828e63c8 proxy/plugin/cache/store/base.py
920 Defl:N 494 46% 11-15-2021 17:03 eb6f60ab proxy/plugin/cache/cache_responses.py
856 Defl:N 440 49% 11-15-2021 17:03 b1be9370 proxy/http/parser/types.py
849 Defl:N 469 45% 11-15-2021 17:03 1721aba9 proxy/plugin/man_in_the_middle.py
843 Defl:N 487 42% 11-15-2021 17:03 caf5a5fa proxy/common/version.py
825 Defl:N 415 50% 11-15-2021 17:03 1505bb40 proxy/http/methods.py
822 Defl:N 473 43% 11-15-2021 17:03 0f26283c proxy/http/exception/base.py
822 Defl:N 435 47% 11-15-2021 17:03 058b0ba1 proxy/__init__.py
782 Defl:N 461 41% 11-15-2021 17:03 d504f647 proxy/core/ssh/client.py
756 Defl:N 378 50% 11-15-2021 17:03 dcd403a0 proxy/core/connection/__init__.py
732 Defl:N 243 67% 11-15-2021 17:03 f307e613 proxy/plugin/adblock.json
728 Defl:N 404 45% 11-15-2021 17:03 ba54afd1 proxy/http/parser/__init__.py
687 Defl:N 376 45% 11-15-2021 17:03 d7d9a9b3 proxy/http/exception/__init__.py
681 Defl:N 407 40% 11-15-2021 17:03 187b7081 proxy/common/_version.py
670 Defl:N 365 46% 11-15-2021 17:03 7b62b30e proxy/core/event/__init__.py
657 Defl:N 361 45% 11-15-2021 17:03 7a1a6b01 proxy/core/acceptor/__init__.py
654 Defl:N 355 46% 11-15-2021 17:03 37ae4fa0 proxy/http/server/__init__.py
653 Defl:N 366 44% 11-15-2021 17:03 6ea59ee8 proxy/http/__init__.py
589 Defl:N 344 42% 11-15-2021 17:03 544a87e8 proxy/dashboard/__init__.py
571 Defl:N 351 39% 11-15-2021 17:03 e6740d9b proxy/http/server/protocols.py
548 Defl:N 345 37% 11-15-2021 17:03 585780a9 proxy/core/connection/types.py
537 Defl:N 326 39% 11-15-2021 17:03 819f36df proxy/http/proxy/__init__.py
521 Defl:N 328 37% 11-15-2021 17:03 2a368121 proxy/http/websocket/__init__.py
517 Defl:N 317 39% 11-15-2021 17:03 5f1a6591 proxy/plugin/cache/__init__.py
510 Defl:N 320 37% 11-15-2021 17:03 b1609efe proxy/core/base/__init__.py
452 Defl:N 305 33% 11-15-2021 17:03 a0814865 proxy/core/ssh/__init__.py
439 Defl:N 302 31% 11-15-2021 17:03 97e387be proxy/http/inspector/__init__.py
426 Defl:N 301 29% 11-15-2021 17:03 52fe3600 proxy/__main__.py
412 Defl:N 293 29% 11-15-2021 17:03 2d9ee4cf proxy/testing/__init__.py
349 Defl:N 254 27% 11-15-2021 17:03 b27e1915 proxy/plugin/cache/store/__init__.py
349 Defl:N 254 27% 11-15-2021 17:03 b27e1915 proxy/core/__init__.py
349 Defl:N 254 27% 11-15-2021 17:03 b27e1915 proxy/common/__init__.py
159 Defl:N 125 21% 11-15-2021 17:04 415b7738 proxy/common/_scm_version.py
125 Defl:N 104 17% 11-15-2021 17:03 38d6430a proxy/common/_scm_version.pyi
92 Defl:N 92 0% 11-15-2021 17:04 aa32508b proxy.py-2.3.2.dev147.dist-info/WHEEL
70 Defl:N 65 7% 11-15-2021 17:03 735abf29 proxy/common/.gitignore
65 Defl:N 67 -3% 11-15-2021 17:03 d28e7db3 proxy/py.typed
45 Defl:N 43 4% 11-15-2021 17:04 c737d1c1 proxy.py-2.3.2.dev147.dist-info/entry_points.txt
6 Defl:N 8 -33% 11-15-2021 17:04 1e7dead1 proxy.py-2.3.2.dev147.dist-info/top_level.txt
Apologies, I looked into the wheel instead of the sdist. We are bundling everything, including the dashboard and the menubar Xcode project — that's why :). I think `MANIFEST.in` was supposed to take care of it.
$ tar -tvzf proxy.py-2.3.2.dev147.tar.gz | sort -nr -k 5
-rw-r--r-- 0 runner docker 632482 Nov 15 22:33 proxy.py-2.3.2.dev147/Dashboard.png
-rw-r--r-- 0 runner docker 344994 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/package-lock.json
-rw-r--r-- 0 runner docker 165548 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/fonts/fontawesome-webfont.ttf
-rw-r--r-- 0 runner docker 155758 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/bootstrap-4.3.1.min.css
-rw-r--r-- 0 runner docker 121233 Nov 15 22:33 proxy.py-2.3.2.dev147/shortlink.gif
-rw-r--r-- 0 runner docker 88145 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/jquery-3.4.1.min.js
-rw-r--r-- 0 runner docker 77160 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/fonts/fontawesome-webfont.woff2
-rw-r--r-- 0 runner docker 74654 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy.py.egg-info/PKG-INFO
-rw-r--r-- 0 runner docker 74654 Nov 15 22:33 proxy.py-2.3.2.dev147/PKG-INFO
-rw-r--r-- 0 runner docker 71539 Nov 15 22:33 proxy.py-2.3.2.dev147/README.md
-rw-r--r-- 0 runner docker 58072 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/bootstrap-4.3.1.min.js
-rw-r--r-- 0 runner docker 39103 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/proxy/server.py
-rw-r--r-- 0 runner docker 39070 Nov 15 22:33 proxy.py-2.3.2.dev147/ProxyPy.png
-rw-r--r-- 0 runner docker 37670 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/images/icons/icon-512x512.png
-rw-r--r-- 0 runner docker 30982 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/font-awesome-4.7.0.min.css
-rw-r--r-- 0 runner docker 29254 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/project.xcworkspace/xcuserdata/abhinav.xcuserdatad/UserInterfaceState.xcuserstate
-rw-r--r-- 0 runner docker 28733 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/test_http_parser.py
-rw-r--r-- 0 runner docker 22633 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/project.pbxproj
-rw-r--r-- 0 runner docker 22357 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/project.xcworkspace/xcuserdata/abhinavsingh.xcuserdatad/UserInterfaceState.xcuserstate
-rw-r--r-- 0 runner docker 21004 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/popper-1.14.7.min.js
-rw-r--r-- 0 runner docker 19863 Nov 15 22:33 proxy.py-2.3.2.dev147/.github/workflows/test-library.yml
-rw-r--r-- 0 runner docker 19396 Nov 15 22:33 proxy.py-2.3.2.dev147/.pylintrc
-rw-r--r-- 0 runner docker 15356 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/test_protocol_handler.py
-rw-r--r-- 0 runner docker 15228 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/handler.py
-rwxr-xr-x 0 runner docker 15072 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/images/icons/icon-256x256.png
-rw-r--r-- 0 runner docker 14177 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/parser/parser.py
-rw-r--r-- 0 runner docker 13470 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/flag.py
-rw-r--r-- 0 runner docker 12716 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/test_main.py
-rw-r--r-- 0 runner docker 12472 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/server/web.py
-rw-r--r-- 0 runner docker 11678 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/test_web_server.py
-rw-r--r-- 0 runner docker 11515 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/plugin/test_http_proxy_plugins.py
-rwxr-xr-x 0 runner docker 9793 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/images/icons/icon-192x192.png
-rw-r--r-- 0 runner docker 9642 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/pki.py
-rw-r--r-- 0 runner docker 8634 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/test_http_proxy_tls_interception.py
-rw-r--r-- 0 runner docker 8525 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy.py.egg-info/SOURCES.txt
-rw-r--r-- 0 runner docker 8512 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/utils.py
-rw-r--r-- 0 runner docker 8483 Nov 15 22:33 proxy.py-2.3.2.dev147/.flake8
-rw-r--r-- 0 runner docker 8347 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/proxy_pool.py
-rw-r--r-- 0 runner docker 8291 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/plugin/test_http_proxy_plugins_with_tls_interception.py
-rw-r--r-- 0 runner docker 8113 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/acceptor/threadless.py
-rwxr-xr-x 0 runner docker 7330 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/images/icons/icon-152x152.png
-rw-r--r-- 0 runner docker 6894 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/proxy.py
-rwxr-xr-x 0 runner docker 6721 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/images/icons/icon-144x144.png
-rw-r--r-- 0 runner docker 6596 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/proxy/plugin.py
-rw-r--r-- 0 runner docker 6264 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/event/subscriber.py
-rw-r--r-- 0 runner docker 6260 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/acceptor/executors.py
-rw-r--r-- 0 runner docker 5945 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/inspector/transformer.py
-rw-r--r-- 0 runner docker 5941 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/common/test_flags.py
-rw-r--r-- 0 runner docker 5859 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/acceptor/acceptor.py
-rwxr-xr-x 0 runner docker 5754 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/images/icons/icon-128x128.png
-rw-r--r-- 0 runner docker 5518 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/test_http_proxy.py
-rw-r--r-- 0 runner docker 5494 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/exceptions/test_http_proxy_auth_failed.py
-rw-r--r-- 0 runner docker 5385 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/reverse_proxy.py
-rw-r--r-- 0 runner docker 5230 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/websocket/frame.py
-rw-r--r-- 0 runner docker 4968 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/base/tcp_server.py
-rw-r--r-- 0 runner docker 4953 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/common/test_pki.py
-rw-r--r-- 0 runner docker 4865 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Base.lproj/Main.storyboard
-rw-r--r-- 0 runner docker 4776 Nov 15 22:33 proxy.py-2.3.2.dev147/examples/README.md
-rw-r--r-- 0 runner docker 4770 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/plugins/mock_rest_api.ts
-rw-r--r-- 0 runner docker 4737 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/core/test_connection.py
-rw-r--r-- 0 runner docker 4619 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/core/ws.ts
-rw-r--r-- 0 runner docker 4590 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/constants.py
-rw-r--r-- 0 runner docker 4470 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/proxy.ts
-rw-r--r-- 0 runner docker 4345 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/inspector/devtools.py
-rw-r--r-- 0 runner docker 4300 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/server/plugin.py
-rw-r--r-- 0 runner docker 4280 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/dashboard/dashboard.py
-rw-r--r-- 0 runner docker 4110 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/acceptor/pool.py
-rw-r--r-- 0 runner docker 4066 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/plugin.py
-rw-r--r-- 0 runner docker 3959 Nov 15 22:33 proxy.py-2.3.2.dev147/Makefile
-rw-r--r-- 0 runner docker 3938 Nov 15 22:33 proxy.py-2.3.2.dev147/.pre-commit-config.yaml
-rw-r--r-- 0 runner docker 3900 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/base/tcp_tunnel.py
-rw-r--r-- 0 runner docker 3874 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/event/dispatcher.py
-rw-r--r-- 0 runner docker 3869 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/test_url.py
-rw-r--r-- 0 runner docker 3864 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/websocket/client.py
-rw-r--r-- 0 runner docker 3756 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/connection/pool.py
-rw-r--r-- 0 runner docker 3698 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/AppDelegate.swift
-rw-r--r-- 0 runner docker 3598 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/core/test_event_dispatcher.py
-rw-r--r-- 0 runner docker 3587 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/plugins.py
-rw-r--r-- 0 runner docker 3499 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/test_chunk_parser.py
-rw-r--r-- 0 runner docker 3475 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/core/test_acceptor.py
-rw-r--r-- 0 runner docker 3438 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/test_proxy_protocol.py
-rw-r--r-- 0 runner docker 3408 Nov 15 22:33 proxy.py-2.3.2.dev147/examples/pubsub_eventing.py
-rw-r--r-- 0 runner docker 3357 Nov 15 22:33 proxy.py-2.3.2.dev147/CODE_OF_CONDUCT.md
-rw-r--r-- 0 runner docker 3345 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/url.py
-rw-r--r-- 0 runner docker 3230 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/test_circular_imports.py
-rw-r--r-- 0 runner docker 3204 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/testing/test_embed.py
-rw-r--r-- 0 runner docker 3178 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/acceptor/listener.py
-rw-r--r-- 0 runner docker 3085 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/connection/connection.py
-rw-r--r-- 0 runner docker 3078 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/filter_by_url_regex.py
-rw-r--r-- 0 runner docker 3028 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/shortlink.py
-rw-r--r-- 0 runner docker 3027 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/mock_rest_api.py
-rw-r--r-- 0 runner docker 3020 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/parser/chunk.py
-rw-r--r-- 0 runner docker 3018 Nov 15 22:33 proxy.py-2.3.2.dev147/setup.cfg
-rw-r--r-- 0 runner docker 2978 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/acceptor/work.py
-rw-r--r-- 0 runner docker 2939 Nov 15 22:33 proxy.py-2.3.2.dev147/check.py
-rw-r--r-- 0 runner docker 2878 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/proxy.html
-rw-r--r-- 0 runner docker 2870 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/server/pac_plugin.py
-rw-r--r-- 0 runner docker 2794 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/cloudflare_dns.py
-rw-r--r-- 0 runner docker 2758 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/dashboard/inspect_traffic.py
-rw-r--r-- 0 runner docker 2755 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/testing/test_case.py
-rw-r--r-- 0 runner docker 2723 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/common/test_utils.py
-rw-r--r-- 0 runner docker 2721 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/event/queue.py
-rw-r--r-- 0 runner docker 2715 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/core/test_conn_pool.py
-rw-r--r-- 0 runner docker 2707 Nov 15 22:33 proxy.py-2.3.2.dev147/.github/workflows/codeql-analysis.yml
-rw-r--r-- 0 runner docker 2617 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/core/test_acceptor_pool.py
-rw-r--r-- 0 runner docker 2582 Nov 15 22:33 proxy.py-2.3.2.dev147/examples/https_connect_tunnel.py
-rw-r--r-- 0 runner docker 2424 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/core/test_event_queue.py
-rw-r--r-- 0 runner docker 2395 Nov 15 22:33 proxy.py-2.3.2.dev147/tox.ini
-rw-r--r-- 0 runner docker 2386 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/ssh/tunnel.py
-rw-r--r-- 0 runner docker 2321 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/event/manager.py
-rw-r--r-- 0 runner docker 2303 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/core/test_listener.py
-rw-r--r-- 0 runner docker 2299 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/core/test_event_subscriber.py
-rw-r--r-- 0 runner docker 2103 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/core/plugin.ts
-rw-r--r-- 0 runner docker 2056 Nov 15 22:33 proxy.py-2.3.2.dev147/examples/web_scraper.py
-rw-r--r-- 0 runner docker 1986 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/connection/server.py
-rw-r--r-- 0 runner docker 1953 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/web_server_route.py
-rw-r--r-- 0 runner docker 1943 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/test_set_open_file_limit.py
-rw-r--r-- 0 runner docker 1899 Nov 15 22:33 proxy.py-2.3.2.dev147/pytest.ini
-rw-r--r-- 0 runner docker 1897 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/package.json
-rw-r--r-- 0 runner docker 1875 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/dashboard/plugin.py
-rw-r--r-- 0 runner docker 1844 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/core/plugins/inspect_traffic.ts
-rw-r--r-- 0 runner docker 1841 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/cache/base.py
-rw-r--r-- 0 runner docker 1804 Nov 15 22:33 proxy.py-2.3.2.dev147/examples/ssl_echo_server.py
-rw-r--r-- 0 runner docker 1795 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/cache/store/disk.py
-rw-r--r-- 0 runner docker 1744 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/connection/client.py
-rw-r--r-- 0 runner docker 1712 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/core/test_event_manager.py
-rw-r--r-- 0 runner docker 1666 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/parser/protocol.py
-rw-r--r-- 0 runner docker 1655 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/modify_chunk_response.py
-rw-r--r-- 0 runner docker 1635 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/__init__.py
-rw-r--r-- 0 runner docker 1579 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/exceptions/test_http_request_rejected.py
-rw-r--r-- 0 runner docker 1575 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/core/plugins/home.ts
-rw-r--r-- 0 runner docker 1555 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/modify_post_data.py
-rw-r--r-- 0 runner docker 1554 Nov 15 22:33 proxy.py-2.3.2.dev147/examples/websocket_client.py
-rw-r--r-- 0 runner docker 1542 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/test_websocket_client.py
-rw-r--r-- 0 runner docker 1540 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/exception/http_request_rejected.py
-rw-r--r-- 0 runner docker 1520 Nov 15 22:33 proxy.py-2.3.2.dev147/LICENSE
-rwxr-xr-x 0 runner docker 1506 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/integration/main.sh
-rw-r--r-- 0 runner docker 1458 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/plugin/utils.py
-rw-r--r-- 0 runner docker 1439 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.pyUITests/proxy_pyUITests.swift
-rw-r--r-- 0 runner docker 1434 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/test_websocket_frame.py
-rw-r--r-- 0 runner docker 1426 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/proxy/auth.py
-rw-r--r-- 0 runner docker 1403 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/rollup.config.js
-rw-r--r-- 0 runner docker 1380 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/filter_by_client_ip.py
-rw-r--r-- 0 runner docker 1366 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/custom_dns_resolver.py
-rw-r--r-- 0 runner docker 1362 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/filter_by_upstream.py
-rw-r--r-- 0 runner docker 1352 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/logger.py
-rw-r--r-- 0 runner docker 1349 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/codes.py
-rw-r--r-- 0 runner docker 1330 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Info.plist
-rw-r--r-- 0 runner docker 1323 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/exception/proxy_conn_failed.py
-rw-r--r-- 0 runner docker 1312 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/redirect_to_custom_server.py
-rw-r--r-- 0 runner docker 1287 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/exception/proxy_auth_failed.py
-rwxr-xr-x 0 runner docker 1231 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/images/icons/icon-32x32.png
-rw-r--r-- 0 runner docker 1208 Nov 15 22:33 proxy.py-2.3.2.dev147/.github/workflows/test-docker.yml
-rw-r--r-- 0 runner docker 1182 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/core/devtools.ts
-rw-r--r-- 0 runner docker 1162 Nov 15 22:33 proxy.py-2.3.2.dev147/examples/tcp_echo_server.py
-rw-r--r-- 0 runner docker 1147 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/types.py
-rw-r--r-- 0 runner docker 1050 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/fluentd.conf
-rw-r--r-- 0 runner docker 1033 Nov 15 22:33 proxy.py-2.3.2.dev147/examples/ssl_echo_client.py
-rwxr-xr-x 0 runner docker 1031 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/monitor_open_files.sh
-rw-r--r-- 0 runner docker 1025 Nov 15 22:33 proxy.py-2.3.2.dev147/.github/workflows/test-brew.yml
-rw-r--r-- 0 runner docker 1017 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/event/names.py
-rw-r--r-- 0 runner docker 1015 Nov 15 22:33 proxy.py-2.3.2.dev147/Dockerfile
-rw-r--r-- 0 runner docker 974 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/common/test_text_bytes.py
-rw-r--r-- 0 runner docker 961 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/manifest.json
-rw-r--r-- 0 runner docker 941 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/cache/store/base.py
-rw-r--r-- 0 runner docker 920 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/cache/cache_responses.py
-rw-r--r-- 0 runner docker 911 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.pyTests/proxy_pyTests.swift
-rw-r--r-- 0 runner docker 903 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Assets.xcassets/AppIcon.appiconset/Contents.json
-rw-r--r-- 0 runner docker 891 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/proxy.css
-rw-r--r-- 0 runner docker 867 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/plugins/traffic_control.ts
-rw-r--r-- 0 runner docker 856 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/parser/types.py
-rw-r--r-- 0 runner docker 852 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/plugins/shortlink.ts
-rw-r--r-- 0 runner docker 852 Nov 15 22:33 proxy.py-2.3.2.dev147/.github/workflows/test-dashboard.yml
-rw-r--r-- 0 runner docker 849 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/man_in_the_middle.py
-rw-r--r-- 0 runner docker 843 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/version.py
-rw-r--r-- 0 runner docker 841 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/core/plugins/settings.ts
-rw-r--r-- 0 runner docker 834 Nov 15 22:33 proxy.py-2.3.2.dev147/.github/ISSUE_TEMPLATE/bug_report.md
-rw-r--r-- 0 runner docker 825 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/methods.py
-rwxr-xr-x 0 runner docker 822 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/__init__.py
-rw-r--r-- 0 runner docker 822 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/exception/base.py
-rw-r--r-- 0 runner docker 782 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/ssh/client.py
-rw-r--r-- 0 runner docker 775 Nov 15 22:33 proxy.py-2.3.2.dev147/examples/tcp_echo_client.py
-rw-r--r-- 0 runner docker 771 Nov 15 22:33 proxy.py-2.3.2.dev147/.github/ISSUE_TEMPLATE/feature_request.md
-rw-r--r-- 0 runner docker 756 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/connection/__init__.py
-rwxr-xr-x 0 runner docker 732 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/chrome_with_proxy.sh
-rw-r--r-- 0 runner docker 732 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/adblock.json
-rw-r--r-- 0 runner docker 728 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/parser/__init__.py
-rw-r--r-- 0 runner docker 727 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.pyUITests/Info.plist
-rw-r--r-- 0 runner docker 727 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.pyTests/Info.plist
-rw-r--r-- 0 runner docker 717 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/core/plugins/inspect_traffic.json
-rw-r--r-- 0 runner docker 700 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/testing/test_test_case.py
-rw-r--r-- 0 runner docker 687 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/exception/__init__.py
-rw-r--r-- 0 runner docker 681 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/_version.py
-rw-r--r-- 0 runner docker 670 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/event/__init__.py
-rw-r--r-- 0 runner docker 657 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/acceptor/__init__.py
-rw-r--r-- 0 runner docker 654 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/server/__init__.py
-rw-r--r-- 0 runner docker 653 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/__init__.py
-rw-r--r-- 0 runner docker 600 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/.eslintrc.json
-rw-r--r-- 0 runner docker 589 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/dashboard/__init__.py
-rw-r--r-- 0 runner docker 576 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/core/plugins/inspect_traffic.html
-rw-r--r-- 0 runner docker 571 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/server/protocols.py
-rwxr-xr-x 0 runner docker 567 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/chrome_with_rdp.sh
-rw-r--r-- 0 runner docker 548 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/connection/types.py
-rw-r--r-- 0 runner docker 547 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/homebrew/stable/proxy.rb
-rw-r--r-- 0 runner docker 537 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/proxy/__init__.py
-rw-r--r-- 0 runner docker 521 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/websocket/__init__.py
-rw-r--r-- 0 runner docker 518 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/test/test.ts
-rw-r--r-- 0 runner docker 517 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/cache/__init__.py
-rw-r--r-- 0 runner docker 510 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/base/__init__.py
-rwxr-xr-x 0 runner docker 497 Nov 15 22:33 proxy.py-2.3.2.dev147/scm-version.sh
-rw-r--r-- 0 runner docker 490 Nov 15 22:33 proxy.py-2.3.2.dev147/SECURITY.md
-rw-r--r-- 0 runner docker 488 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/__init__.py
-rw-r--r-- 0 runner docker 474 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/homebrew/develop/proxy.rb
-rw-r--r-- 0 runner docker 466 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/core/plugins/inspect_traffic.js
-rw-r--r-- 0 runner docker 462 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/ContentView.swift
-rw-r--r-- 0 runner docker 452 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/ssh/__init__.py
-rw-r--r-- 0 runner docker 446 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Assets.xcassets/StatusBarButtonImage.imageset/[email protected]
-rw-r--r-- 0 runner docker 439 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/inspector/__init__.py
-rw-r--r-- 0 runner docker 426 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/__main__.py
-rw-r--r-- 0 runner docker 425 Nov 15 22:33 proxy.py-2.3.2.dev147/.coveragerc
-rw-r--r-- 0 runner docker 417 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/dashboard/test_dashboard.py
-rw-r--r-- 0 runner docker 412 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/testing/__init__.py
-rw-r--r-- 0 runner docker 389 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Assets.xcassets/StatusBarButtonImage.imageset/Contents.json
-rw-r--r-- 0 runner docker 387 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/Procfile
-rw-r--r-- 0 runner docker 365 Nov 15 22:33 proxy.py-2.3.2.dev147/.deepsource.toml
-rw-r--r-- 0 runner docker 349 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/testing/__init__.py
-rw-r--r-- 0 runner docker 349 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/plugin/__init__.py
-rw-r--r-- 0 runner docker 349 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/exceptions/__init__.py
-rw-r--r-- 0 runner docker 349 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/__init__.py
-rw-r--r-- 0 runner docker 349 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/dashboard/__init__.py
-rw-r--r-- 0 runner docker 349 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/core/__init__.py
-rw-r--r-- 0 runner docker 349 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/common/__init__.py
-rw-r--r-- 0 runner docker 349 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/benchmark/__init__.py
-rw-r--r-- 0 runner docker 349 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/cache/store/__init__.py
-rw-r--r-- 0 runner docker 349 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/__init__.py
-rw-r--r-- 0 runner docker 349 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/__init__.py
-rw-r--r-- 0 runner docker 343 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/xcuserdata/abhinavsingh.xcuserdatad/xcschemes/xcschememanagement.plist
-rw-r--r-- 0 runner docker 343 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/xcuserdata/abhinav.xcuserdatad/xcschemes/xcschememanagement.plist
-rw-r--r-- 0 runner docker 322 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/proxy_py.entitlements
-rw-r--r-- 0 runner docker 311 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/spec/helpers/browser.js
-rw-r--r-- 0 runner docker 266 Nov 15 22:33 proxy.py-2.3.2.dev147/codecov.yml
-rw-r--r-- 0 runner docker 249 Nov 15 22:33 proxy.py-2.3.2.dev147/pyproject.toml
-rw-r--r-- 0 runner docker 244 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/README.md
-rw-r--r-- 0 runner docker 238 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/project.xcworkspace/xcshareddata/IDEWorkspaceChecks.plist
-rw-r--r-- 0 runner docker 238 Nov 15 22:33 proxy.py-2.3.2.dev147/.gitignore
-rw-r--r-- 0 runner docker 235 Nov 15 22:33 proxy.py-2.3.2.dev147/requirements-testing.txt
-rw-r--r-- 0 runner docker 226 Nov 15 22:33 proxy.py-2.3.2.dev147/.editorconfig
-rw-r--r-- 0 runner docker 170 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/spec/support/jasmine.json
-rw-r--r-- 0 runner docker 159 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/_scm_version.py
-rw-r--r-- 0 runner docker 158 Nov 15 22:33 proxy.py-2.3.2.dev147/.dockerignore
-rw-r--r-- 0 runner docker 153 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/project.xcworkspace/contents.xcworkspacedata
-rw-r--r-- 0 runner docker 136 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/tsconfig.json
-rw-r--r-- 0 runner docker 125 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/_scm_version.pyi
-rw-r--r-- 0 runner docker 93 Nov 15 22:33 proxy.py-2.3.2.dev147/.yamllint
-rw-r--r-- 0 runner docker 80 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/proxy.pac
-rw-r--r-- 0 runner docker 70 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/.gitignore
-rw-r--r-- 0 runner docker 65 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/py.typed
-rw-r--r-- 0 runner docker 62 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Preview Content/Preview Assets.xcassets/Contents.json
-rw-r--r-- 0 runner docker 62 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Assets.xcassets/Contents.json
-rw-r--r-- 0 runner docker 59 Nov 15 22:33 proxy.py-2.3.2.dev147/MANIFEST.in
-rw-r--r-- 0 runner docker 50 Nov 15 22:33 proxy.py-2.3.2.dev147/.darglint
-rw-r--r-- 0 runner docker 45 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy.py.egg-info/requires.txt
-rw-r--r-- 0 runner docker 45 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy.py.egg-info/entry_points.txt
-rw-r--r-- 0 runner docker 42 Nov 15 22:33 proxy.py-2.3.2.dev147/requirements.txt
-rw-r--r-- 0 runner docker 38 Nov 15 22:33 proxy.py-2.3.2.dev147/requirements-tunnel.txt
-rw-r--r-- 0 runner docker 37 Nov 15 22:33 proxy.py-2.3.2.dev147/requirements-release.txt
-rw-r--r-- 0 runner docker 33 Nov 15 22:33 proxy.py-2.3.2.dev147/.github/FUNDING.yml
-rwxr-xr-x 0 runner docker 29 Nov 15 22:33 proxy.py-2.3.2.dev147/git-pre-commit
-rw-r--r-- 0 runner docker 28 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/.gitignore
-rwxr-xr-x 0 runner docker 18 Nov 15 22:33 proxy.py-2.3.2.dev147/git-pre-push
-rw-r--r-- 0 runner docker 15 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/.gitignore
-rw-r--r-- 0 runner docker 10 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/.gitignore
-rw-r--r-- 0 runner docker 6 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy.py.egg-info/top_level.txt
-rw-r--r-- 0 runner docker 1 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy.py.egg-info/not-zip-safe
-rw-r--r-- 0 runner docker 1 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy.py.egg-info/dependency_links.txt
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/testing/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/plugin/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/integration/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/exceptions/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/http/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/dashboard/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/core/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/common/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/benchmark/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/tests/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/testing/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/cache/store/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/cache/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/plugin/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/websocket/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/server/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/proxy/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/parser/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/inspector/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/exception/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/http/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/dashboard/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/ssh/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/event/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/connection/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/base/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/acceptor/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/core/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/common/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/proxy.py.egg-info/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.pyUITests/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.pyTests/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Preview Content/Preview Assets.xcassets/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Preview Content/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Base.lproj/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Assets.xcassets/StatusBarButtonImage.imageset/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Assets.xcassets/AppIcon.appiconset/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/Assets.xcassets/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/xcuserdata/abhinavsingh.xcuserdatad/xcschemes/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/xcuserdata/abhinavsingh.xcuserdatad/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/xcuserdata/abhinav.xcuserdatad/xcschemes/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/xcuserdata/abhinav.xcuserdatad/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/xcuserdata/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/project.xcworkspace/xcuserdata/abhinavsingh.xcuserdatad/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/project.xcworkspace/xcuserdata/abhinav.xcuserdatad/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/project.xcworkspace/xcuserdata/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/project.xcworkspace/xcshareddata/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/project.xcworkspace/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/proxy.py.xcodeproj/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/menubar/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/homebrew/stable/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/homebrew/develop/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/homebrew/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/helper/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/examples/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/test/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/images/icons/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/images/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/fonts/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/static/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/plugins/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/core/plugins/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/core/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/src/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/spec/support/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/spec/helpers/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/spec/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/dashboard/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/.github/workflows/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/.github/ISSUE_TEMPLATE/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/.github/
drwxr-xr-x 0 runner docker 0 Nov 15 22:33 proxy.py-2.3.2.dev147/
Meanwhile, I've sent a PR with a quick-and-dirty integration of that smoke test into pytest, reducing the number of jobs needed: https://github.com/abhinavsingh/proxy.py/pull/746.
Initial sphinx docs PR (w/o RTD config):
- https://github.com/abhinavsingh/proxy.py/pull/747
I'll take a look at them shortly. Meanwhile I noticed something minor: `proxy -v` now returns `proxy.py vv2.3.2-dev147.g35b643c.d20211114`. Note the double `vv`. I haven't dug into it much; just taking a note about it for a later follow-up.
Looked into RTFD. Unfortunately, it requires access to the GitHub private project dashboard. I am not even sure why. For that reason, I won't be able to use RTFD anymore. I am thinking of just using GH Pages.
Never mind, I realized only the project dashboard access is protected. Enabled RTFD for `proxy.py`.

Used `mkdocs` for now. Link here: https://proxypy.readthedocs.io/en/latest/. It serves a 404 for now, probably due to build delay; the project might not be `mkdocs`-compliant. We should probably change this to a Sphinx build after your #747.
@abhinavsingh mkdocs is wrong, it's not Sphinx but a completely different project. Also, it's best to use sphinx htmldir. OTOH, these settings don't matter as they will be overridden by an RTD config in #747. Just make sure that the GH integration + the webhook are active + PR builds are enabled (this is a web UI only option).
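For reference, a minimal Read the Docs v2 config for a Sphinx build might look roughly like this. This is a hedged sketch, not the project's actual config: the `docs/conf.py` path and the `docs` extra are illustrative assumptions.

```yaml
# Hypothetical .readthedocs.yml sketch -- paths and extras are assumptions:
version: 2

sphinx:
  # assumed location of the Sphinx configuration
  configuration: docs/conf.py

python:
  install:
    - method: pip
      path: .
      extra_requirements:
        - docs  # assumed extra pulling in Sphinx and its extensions
```

An in-repo config like this overrides whatever build tool is selected in the RTD web UI, which is why the `mkdocs` setting there stops mattering once #747 lands.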
> I'll take a look at them shortly, meanwhile I saw something, it's minor: `proxy -v` now returns `proxy.py vv2.3.2-dev147.g35b643c.d20211114`. Note the double `vv`, I haven't dug much into it, just wanted to take a note about it for later follow up
Is it in the CI or locally? I think I saw this mistake in some of the GHA job/step names.
> Is it in the CI or locally?
I noticed this today when quoting an example on another issue: https://github.com/abhinavsingh/proxy.py/issues/729#issuecomment-971608552. Looks like it only happens for the `Via` proxy header. It doesn't happen for `proxy -v` or `proxy -h`, nor in CI logs. Dropping the `v` from this constant should fix it:
```python
PROXY_AGENT_HEADER_VALUE = b'proxy.py v' + \
    __version__.encode('utf-8', 'strict')
```
@abhinavsingh this is happening because your make target generates the SCM version stub incorrectly. You add a `v` there and `setuptools-scm` doesn't.
Correct, I'll address this separately; it's not a major issue until we make a final stable release.
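To illustrate the mismatch described above (a minimal sketch with illustrative version strings, not values read from an actual build): `setuptools-scm` normalizes a Git tag like `v2.3.2` to the version `2.3.2` without the leading `v`, so a hand-rolled stub that re-adds the prefix makes the header constant double it.

```python
# Illustrative version strings -- not taken from a real build.
scm_version = '2.3.2.dev147'        # what setuptools-scm would write
stub_version = 'v' + scm_version    # what a prefix-adding make target writes

# The header constant prepends 'v' itself, so the stubbed version doubles it:
buggy = b'proxy.py v' + stub_version.encode('utf-8', 'strict')
fixed = b'proxy.py v' + scm_version.encode('utf-8', 'strict')

print(buggy)  # b'proxy.py vv2.3.2.dev147' -- the double 'v'
print(fixed)  # b'proxy.py v2.3.2.dev147'
```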
Apologies, I looked into the wheel instead of the sdist. We are bundling everything, including the dashboard and the menubar Xcode project; that's why :). I thought `MANIFEST.in` was supposed to take care of it.
setuptools-scm auto-includes everything tracked by Git into an sdist (as you should ship the full source so that it's usable by downstream packagers). You can then exclude some files from the actual installation (and the wheel) via `setup.cfg`.

With setuptools-scm, you normally don't need that `MANIFEST.in` file at all.
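As a rough sketch of that split (the section values here are illustrative, not the project's actual config), the wheel contents can be limited declaratively in `setup.cfg` while the sdist keeps everything Git tracks:

```ini
; Hypothetical setup.cfg fragment -- package names are illustrative:
[options]
packages = find:

[options.packages.find]
include =
    proxy
    proxy.*
exclude =
    tests
    tests.*
```

With discovery configured like this, non-package directories such as `dashboard/` or `menubar/` never end up in the wheel at all, while the sdist still ships the whole Git tree.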
Yep, we don't bundle the dashboard. That is intentional because of size; not everyone might use/require the dashboard. I was thinking of maybe providing the dashboard as `pip install proxy.py[dashboard]` functionality. Underneath, it would be a separate package. We could bundle it together with the `proxy.py` package, but there are too many bits to it. The dashboard has its own plugin ecosystem. One of the plugins is `devtools`, which bundles the entire `chrome-devtools-inspector` into the `proxy.py` release. This would significantly bump the pip distribution size, something we don't want to do by default.
For this, you'd need extras and a separate project.
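A hedged sketch of what that could look like in `setup.cfg` (both the extra name and the `proxy.py-dashboard` package are hypothetical; no such package exists yet):

```ini
; Hypothetical: 'proxy.py-dashboard' would be a separately published package.
[options.extras_require]
dashboard =
    proxy.py-dashboard
```

`pip install "proxy.py[dashboard]"` would then pull in the dashboard package only for users who ask for it, keeping the default distribution small.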
Just throwing out some ideas here. Maybe, just maybe, we could split out each plugin into a separate package. This is probably too much, but it has several advantages. One advantage is that we as authors get insight into which plugins are used most by the community. Eventually, at some point, I also plan to build a community plugin system, where plugins can be discovered from GitHub and installed ad hoc. E.g. `pip install proxy.py[reverse-proxy,dashboard,modified-chunk,man-in-the-middle]` would install the intended plugins only. We could also use the underlying `setuptools` ecosystem to discover plugins under a top-level namespace (PS: I read about it some time back; my understanding of it is still hazy). However, most repo-provided plugin packages would only contain a single file. Except for a few plugins, e.g. `ProxyPool`, I expect folks to use the provided plugins as a base to build their real production-grade plugins. Some folks have even published their own wrapper library, e.g. Zusyaku/Termux-And-Kali-Linux-V2@main/Proxverter-main#readme. By discovering community plugins, even external plugins would gain visibility within the `proxy.py` ecosystem.
Two points here:
- You were probably thinking about `pkg_resources`, which is bundled with `setuptools`. It's not recommended to use it anymore. The modern way is to use https://importlib-metadata.readthedocs.io/en/latest/using.html (which is a third-party package for older versions but is also included in the stdlib in newer CPythons). This is what you use to iterate the entry points that the plugins would specify in their packages.
- You could also look into https://github.com/pytest-dev/pluggy, which is central to how the plugin ecosystems work in pytest and tox.
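For example, a minimal discovery sketch using stdlib `importlib.metadata` (the entry-point group name `proxy.py.plugins` is a hypothetical choice for illustration, not an existing convention):

```python
from importlib.metadata import entry_points


def discover_plugins(group='proxy.py.plugins'):
    """Map plugin names to entry points declared under a hypothetical group."""
    eps = entry_points()
    if hasattr(eps, 'select'):  # Python 3.10+ selectable interface
        selected = eps.select(group=group)
    else:                       # older dict-like interface (Python 3.8/3.9)
        selected = eps.get(group, [])
    # ep.load() would import the plugin object on demand
    return {ep.name: ep for ep in selected}


# With no plugin packages installed, nothing is discovered:
print(discover_plugins())  # {}
```

Each plugin package would declare its entry points in its own packaging metadata, so installing the package is all it takes to make the plugin discoverable.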
I noticed that the sdist is substantially bigger than the wheel; you may want to do something about it: test.pypi.org/project/proxy.py/2.3.2.dev147/#files. I haven't checked closely, but it seems like the sdist contains redundant content (like vendored normal and minified JS/CSS files). We could drop some of them to reduce the size.
Here is the breakdown. I don't see any static files. There is an `adblock.json`, which is small and serves as an example config for the filter-by-regex plugin. Could we just ship `.pyc` files; would that be of any help, or even work?
No, you shouldn't worry about `.pyc`. Always ship the `.py` files.
> `$ unzip -vl ~/Downloads/proxy.py-2.3.2.dev147-py3-none-any.whl | sort -nr -k 1`

Instead of this, you can use `wheel unpack ~/Downloads/proxy.py-2.3.2.dev147-py3-none-any.whl` to get the files on disk.