serverless-python-requirements
dockerizePip w/ requirements.txt referencing relative package path?
Hi! My requirements.txt looks like this (it references a relative path). I understand this is failing because of the way the Docker volume is mounted, but can you suggest a workaround?
requirements.txt
boto3
requests
../../packages/mypackage
....
31a13f1e4356: Pull complete
Digest: sha256:d944b5ae251d24a089c4cc8c889e861cca6ce0ea0da376c364eeebe9ea4cce58
Status: Downloaded newer image for lambci/lambda:build-python3.8
ERROR: Invalid requirement: '../../packages/mypackage' (from line 1 of /var/task/requirements.txt)
Hint: It looks like a path. File '../../packages/mypackage' does not exist.
I also tried putting a pre-built .whl file under the project's local directory, py-packages/mypackage.whl:
pythonRequirements:
layer: true
dockerizePip: true
vendor: ./py-packages
I still get:
Error: STDOUT: Processing ./py-packages/mypackage.whl
STDERR: WARNING: Requirement './py-packages/mypackage.whl' looks like a filename, but the file does not exist
ERROR: Could not install packages due to an EnvironmentError: [Errno 2] No such file or directory: '/var/task/py-packages/mypackage.whl'
requirements.txt
boto3
requests
./py-packages/mypackage.whl
#258
So I can get it to work if I do not reference the local mypackage in my requirements.txt and instead only extract the mypackage.whl contents under ./py-packages, which is what the vendor: ./py-packages config points at. (The docs could use an update on what exactly goes in there, i.e. an extracted package; it's unclear as written.)
So I guess now I'm wondering: how can I BOTH reference my local package in requirements.txt AND use it with the dockerizePip option? It seems like you can't do both. The other disadvantage is that the transitive dependencies mypackage.whl requires no longer get pulled in automatically; I have to track those down myself. Maybe I'm doing something wrong?
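For reference, the vendor layout that worked for me looked roughly like this (names are illustrative; note it's the extracted package contents, not the .whl itself):

```
py-packages/
└── mypackage/
    ├── __init__.py
    └── ...
```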
hey any luck here?
I am trying to do the same thing with private packages and CI/CD, any luck?
What I've found to work is to create your own Dockerfile and COPY your repositories into /var (it may be easier to put them in one folder and COPY the whole folder in). Note: you may need to adjust the path references in your requirements.txt to ../ or /var/. Nothing that can't be done in a shell script. You can also use VOLUME to mount them there instead.
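As a minimal sketch of that approach (the folder name and target path are placeholders, not part of the plugin's conventions):

```dockerfile
# Start from the build image the plugin would use for python3.8
FROM lambci/lambda:build-python3.8

# Bake the local packages into the image at a stable absolute path,
# so pip can resolve them during `pip install -r requirements.txt`
COPY ./packages /var/packages
```

requirements.txt would then reference /var/packages/mypackage instead of a relative path like ../../packages/mypackage.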
That's the solution that also worked for me.
The following changes solved my issue. Note that I'm using Poetry.
- Create a Dockerfile and copy the local package into the Docker environment. (The local package can't be outside the project scope, e.g. ../../.)
FROM lambci/lambda:build-python3.8
# Copy Core Module
COPY ./core /var/lib/core
- Create a PowerShell script, serverless-deploy.ps1, that rewrites the local package path in requirements.txt and wraps the deployment:
# Export pyproject.toml to requirements.txt
poetry export --without-hashes -f requirements.txt -o requirements.txt --with-credentials
# Replace local package paths in requirements.txt (Win10 WSL)
bash -c "sed -i 's/core @ .*;/core @ file:\/\/\/var\/lib\/core;/g' requirements.txt"
serverless deploy
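To make the sed step concrete, here is what that substitution does to a typical `poetry export` line (the Windows source path is illustrative):

```shell
# A sample line as `poetry export` might emit it for a local path dependency
req=$(mktemp)
echo 'core @ file:///C:/Users/me/project/core ; python_version >= "3.8"' > "$req"

# Rewrite the local path to the location the Dockerfile COPYs the package to
sed -i 's/core @ .*;/core @ file:\/\/\/var\/lib\/core;/g' "$req"

cat "$req"
# -> core @ file:///var/lib/core; python_version >= "3.8"
```

The environment marker after the semicolon is preserved, so pip still applies it when installing inside the container.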
- Reference the Dockerfile in serverless.yml:
custom:
pythonRequirements:
dockerizePip: true
dockerFile: ./Dockerfile
usePoetry: false