pip-tools
Use of `entry_points` and `project.scripts` with `pip-sync`
What's the problem this feature will solve?

As an engineer looking to `pip-sync` a Python project packaged using pip-tools, I expect scripts defined with `entry_points` (using `setup.py`) and/or `project.scripts` (using `pyproject.toml`) to be installed.
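For reference, a console script of the kind described might be declared like this in `pyproject.toml` (the project and entry-point names here are hypothetical):

```toml
[project]
name = "myproject"
version = "0.1.0"

[project.scripts]
# "pip install" of this project creates a bin/mytool launcher
# that invokes the main() function in myproject/cli.py
mytool = "myproject.cli:main"
```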
Describe the solution you'd like

- The `pip-sync` command results in the creation of entries in the `bin` directory of the Python environment corresponding to the defined scripts.
- The `pip-sync` command results in the installation of the root package (i.e., as if `pip install` of the root package had been issued).
Alternative Solutions

- Avoid using `pip-sync`: `pip install` of the root package installs `bin` directory entries for the scripts defined in the package, but doesn't have the benefit of using `pip-sync`.
- `pip-sync` followed by `pip install`: `pip-sync` will uninstall the root package, so the follow-up `pip install` is necessary for the `bin` entries for the scripts to be re-created.
Additional context

The effect of the absence of this feature is described in this issue: https://github.com/jazzband/pip-tools/issues/1158
It's not that the scripts are ignored, it's that `pip-sync` installs explicitly listed dependencies (in contrast to your second proposed solution). My take on this is at the linked issue, copied here:
I don't know which is the intended behavior when using a setup file as `pip-compile`'s input -- including the top package itself or not.

I would personally expect the setup-file package itself to be omitted from the requirements files, to be in line with projects that use both setup and requirements files, without necessarily using pip-tools in any capacity. And if I want it included in this case, I'll make a simple `.in` file.

Note that in a case like this you probably want to install (only) the root project in editable mode, which AFAIU isn't really able to be specified with `pip-sync PROJFILE`.
I'm not 110% against changing this behavior, but this reflects my current understanding of what's intended by the project, and I welcome insight from other users and devs on the topic.
At the very least, I suggest adding this alternative method to the list:
```console
$ echo '-e .' >dev-requirements.in
$ pip-compile dev-requirements.in
$ pip-sync dev-requirements.txt
```
I'll also note that in my own wrapper scripts, I do employ your second suggested alternative of following up syncs with `pip install`, due to #896, which despite being closed, is AFAIK still an issue.
Hi, @AndydeCleyre, thanks for replying.
Regarding your suggestion:

```console
$ echo '-e .' >dev-requirements.in
$ pip-compile dev-requirements.in
$ pip-sync dev-requirements.txt
```
I'd like to put the self-requirement into `requirements.in` (which is used by my `requirements.test.in`, which is used by my `requirements.dev.in`), but it looks like `pip-compile` can't parse the editable self-requirement and I see the error:
```console
$ venv/bin/pip-compile --output-file requirements.test.txt requirements.test.in
WARNING: the legacy dependency resolver is deprecated and will be removed in future versions of pip-tools. The default resolver will be changed to 'backtracking' in pip-tools 7.0.0. Specify --resolver=backtracking to silence this warning.
error: subprocess-exited-with-error

× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [1 lines of output]
    error in vault setup command: 'install_requires' must be a string or list of strings containing valid project/version requirement specifiers; Parse error at "'-e file:'": Expected W:(abcd...)
    [end of output]
```
Not directly an issue, but along the way, the `-e .` dependency is expanded to an absolute path (i.e., `file:///home/me/path/to/my/project`) when it ends up in downstream `requirements.*.txt` files, which would be a problem for shared development.
A bit about my use case, which perhaps isn't a good fit for pip-tools:

I'm looking to use pip-tools both for development and for deployment to the target execution environment, which takes the form of the same virtualenv used over time across deployed versions of the software. Essentially, deployments are the result of a `git pull` and (ideally) a `pip-sync`. To be clear, I'm not using separate virtualenvs across versions, or an isolated environment (e.g., as provided by Docker), so what I'm doing is kind of old-fashioned, but the possible benefit to me of a tool like `pip-sync` is significant.

Am I barking up the wrong tree with this pattern and pip-tools?
Can you share the `requirements.in` here? And at least the `install_requires` value in the `setup.py`?
`requirements.in`:

```
-e .
django<4.1
django-filter
djangorestframework
django-basicauth
django-structlog
fs
internetarchive
Jinja2
psycopg2-binary
PyYAML<6,>=3.10
requests
s3fs
sentry-sdk
more-itertools
tqdm
GitPython
python-dotenv
celery
backoff
redis
prometheus-client
django-prometheus
temporalio
drf-spectacular
jsonschema<4,>=2.5.1
pyhumps
psutil
loky
filetype
sanitize-filename
```
Abridged `setup.py`:

```python
install_requires = []
with open("requirements.txt", encoding="utf-8") as f:
    install_requires = list(f)

setup(
    # ...
    install_requires=install_requires,
    # ...
)
```
You have a circular dependency situation defined. You're feeding `requirements.txt` into `setup.py`, and including the project itself (`setup.py`) in `requirements.txt` (and `.in`). This isn't going to work.

So you may want to, in `setup.py`, filter out the self-reference coming from `requirements.txt`, or instead use a separate `.txt`/`.in` file for including the project itself. In other words: don't specify the project as a dependency of itself.
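A minimal sketch of the filtering approach, assuming the self-reference shows up in `requirements.txt` as an editable line (`-e .` or the expanded `-e file:///...` form); the helper name is hypothetical:

```python
def read_install_requires(path="requirements.txt"):
    """Read pinned requirements for use in install_requires, skipping
    blank lines, comments, and editable self-references ("-e ..." lines),
    which setuptools cannot parse as requirement specifiers."""
    requires = []
    with open(path, encoding="utf-8") as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("#") or line.startswith("-e "):
                continue
            requires.append(line)
    return requires
```

In the abridged `setup.py` above, this would replace the bare `list(f)`, e.g. `setup(..., install_requires=read_install_requires())`.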
I'm also surprised you're feeding in the `.txt` rather than the `.in`, but that's of course your choice.

Note that I'm not well versed in what is and is not acceptable syntax for `install_requires` elements.
I need to note now that I didn't realize listing a project as a dependency of itself might be an accepted pattern, but it seems it is/was/might-be. It's not currently handled by pip-tools.
See also:
- #1820
- #1685