Results: 436 comments by John Sirois

I'm going to guess the CPython code doesn't re-implement zip but links against ziplib, and that's where the variance @dgkatz noted comes from. Some versions of ziplib support this and some...

Eek - I was living a fantasy. The zipimport lib is written in Python and does not track the zipfile stdlib. https://github.com/python/cpython/blob/3.9/Lib/zipimport.py#L360 - so it just gets this wrong and...

And ... that zipimport shortcoming is documented here: https://bugs.python.org/issue32959
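For anyone wanting a quick pre-flight check before shipping a large PEX, here is a minimal sketch. It assumes the shortcoming above is zipimport's missing zip64 support (consistent with the >2GB failures discussed below); `likely_to_trip_zipimport` is a hypothetical helper, not part of Pex, and it leans on the stdlib `zipfile` module, which does read zip64 archives:

```python
import zipfile

# Classic (non-zip64) limits; these thresholds are an assumption about where
# the pure-Python zipimport falls over, not its exact internal checks.
ZIP32_MAX_BYTES = 0xFFFFFFFF
ZIP32_MAX_ENTRIES = 0xFFFF


def likely_to_trip_zipimport(path):
    """Hypothetical helper: flag archives whose layout needs zip64 records."""
    with zipfile.ZipFile(path) as zf:
        infos = zf.infolist()
        if len(infos) > ZIP32_MAX_ENTRIES:
            return True
        return any(
            info.file_size > ZIP32_MAX_BYTES or info.header_offset > ZIP32_MAX_BYTES
            for info in infos
        )
```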

@rom1504 it depends on what you want to do with the PEX. Since you'll already have to split it in 2 and have more than 1 file to ship around, perhaps...

Ok, I have not used Spark before, but I'd guess you want the following config, tweaking their example:
```
export PYSPARK_DRIVER_PYTHON=python  # Do not set in cluster modes.
export PYSPARK_PYTHON=./pyspark_pex_env.pex/__main__.py
...
```

@rom1504 Pex does already build in parallel, using your number of cores by default (see `--help` for `--jobs`). The build process looks like:
1. Single subprocess: `pip download` (this performs...
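As a concrete illustration, a minimal sketch of pinning the parallelism explicitly rather than relying on the core-count default. The requirements file name and job count are made up; the flag names reflect my understanding of the pex CLI (`-r` for requirements, `-o` for the output file), with `--jobs` as mentioned above:

```python
# Hypothetical build script (not part of Pex): shell out to the pex CLI with an
# explicit --jobs value instead of the default (number of cores).
import subprocess

subprocess.run(
    [
        "pex",
        "-r", "requirements.txt",   # resolve these requirements
        "--jobs", "8",              # run up to 8 parallel build subprocesses
        "-o", "app.pex",            # write the resulting PEX here
    ],
    check=True,
)
```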

@dgkatz - finally looping back. Was your issue related to the huge PEX issue @tentwelfths and @rom1504 encountered (a >2GB PEX)? If so, I'd like to close this issue since @rom1504...

Some gotchas to avoid in either the initial implementation or with follow-ups:
1. Bifurcated resolves: https://github.com/python-poetry/poetry/issues/4381
2. Environment marker explosion: https://github.com/pdm-project/pdm/issues/449
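To make the second gotcha concrete, a small sketch (using the third-party `packaging` library, not Pex APIs) of how the same requirement's applicability flips per marker environment, which is what multiplies lock entries when many targets are covered:

```python
# Illustration only: one marker, evaluated against two different target
# environments, yields different answers -- a lock spanning many targets has
# to account for every such combination.
from packaging.markers import Marker

marker = Marker('python_version < "3.8" and sys_platform == "win32"')
print(marker.evaluate({"python_version": "3.7", "sys_platform": "win32"}))   # True
print(marker.evaluate({"python_version": "3.10", "sys_platform": "linux"}))  # False
```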

There is one known impossible-to-handle case: when a #2-style ("platform agnostic") resolve needs to traverse an sdist. The sdist may require being (partially) built to extract Python version...
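For illustration, a contrived `setup.py` (not from any real project) showing the kind of dynamic metadata that forces this: the dependency list is computed at build time from the interpreter running the build, so a platform-agnostic resolver cannot know it without (partially) building the sdist:

```python
# Contrived example: install_requires depends on the interpreter that builds
# the sdist, so the metadata is unknowable without running this code.
import sys

from setuptools import setup

install_requires = ["requests"]
if sys.version_info < (3, 8):
    # Backport needed only on older Pythons -- invisible to a static reader.
    install_requires.append("importlib-metadata")

setup(
    name="example-dynamic-metadata",
    version="0.0.1",
    install_requires=install_requires,
)
```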

This work seems separable into the following task graph:
1. Platform-dependent locks with requirements.txt-compatible output: #1401
2. a. Platform-agnostic locks with requirements.txt-compatible output: #1402 | b....