serverless-python-requirements
Serverless python custom packaging with nested folders
Hi, I have the following project structure:
```
src/
  requirements.txt
  function_a.py
  function_b.py
  common/
    common_1.py
    common_2.py
```
With this function configuration the zip artifacts contain the common and function_* Lambda source files plus dependencies, which is what I want.
Serverless configuration - 1

```yaml
functionA:
  module: src
  handler: function_a.handler
  package:
    include:
      - src/common/**
      - src/function_a.py
functionB:
  module: src
  handler: function_b.handler
  package:
    include:
      - src/common/**
      - src/function_b.py
```
Now I would like to move function_b into a nested folder, src/function_b, with its own requirements.txt file:
```
src/
  requirements.txt
  function_a.py
  function_b/
    function_b.py
    requirements.txt
  common/
    common_1.py
    common_2.py
```
but when I try to package it using this serverless configuration, the function_b zip file contains only function_b and its dependencies, without the src/common/** sources.
Serverless configuration - 2

```yaml
functionB:
  module: src/function_b
  handler: function_b.handler
  package:
    include:
      - src/common/**
      - src/function_b/function_b.py
```
The other relevant part of the serverless.yml file is this:

```yaml
pythonRequirements:
  slim: true
  useStaticCache: true
  useDownloadCache: true
  cacheLocation: './._cache'
  staticCacheMaxVersions: 10
  dockerizePip: non-linux
package:
  individually: true
  exclude:
    - ./**
```
Is there something I am doing wrong, or is this a plugin/serverless limitation or bug?
Thanks!
I have the same problem. Trying to include common files into a function where the module option is used does not work. If I run `sls package -p ./pkg` I see my common folder, but somehow when requirements are injected and the package is uploaded to AWS, the common folder is removed.
Have you had any luck with this?
Unfortunately not; in my current config all the produced artifacts contain all the dependencies plus all my code.
@giuseppenicolais Have you tried clearing out the caches via `sls requirements cleanCache` and repackaging?
Hello,
I have this same problem. I have reusable code in a "common" folder at the root. I can't manage to include the common folder. Only func/test_func/handler.run appears with the required dependencies.
```
/
/common/{...}
/func/test_func/handler.py
/func/test_func/requirements.txt
```
Relevant part of my serverless code is:
```yaml
package:
  individually: true
  include:
    - "!**/*"
functions:
  FunctionA:
    module: func/test_func
    handler: handler.run
    package:
      include:
        - func/test_func/**
        - common/**
```
Would I have to resort to symlinks?
Thank you for your help
Packaging individually currently effectively "chroot"s into each function's folder and then treats that as its own micro stack. In a general/best-practice Python project you don't include files from above your root directory, even though you can with a hack or two.
A first, simple recommendation for anyone in this thread is to consider moving your common files into a library which you publish to a pip repository (public or private), and then use the requirements.txt in each of your functions to include those common libs.
A second one, if you want to keep it simple and within the same repo, is to either symlink or pre-copy the files needed into each function as part of your deploy process. I haven't confirmed whether symlinking works; please test and report back.
So that's what it does now. As for what the plugin should do, I believe it should try to cover the most common use cases, and I agree with everyone above that this is a valid use case. If I get a little time in the next few days I'll look at the code, see what it would take to fix this, and either fix it or report back.
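For the pre-copy approach, a minimal sketch of a pre-deploy step could look like this. The function list and paths are assumptions based on the layout in this thread; adjust them to your repo, and run it before `sls package`:

```python
# Sketch: copy the shared common/ folder into each function directory
# before packaging, so each individually packaged function carries its
# own copy. FUNCTION_DIRS is a hypothetical list for this thread's layout.
import shutil
from pathlib import Path

FUNCTION_DIRS = ["func/test_func"]

def sync_common(repo_root: str = ".") -> None:
    root = Path(repo_root)
    for fn_dir in FUNCTION_DIRS:
        dest = root / fn_dir / "common"
        shutil.rmtree(dest, ignore_errors=True)   # drop any stale copy
        shutil.copytree(root / "common", dest)    # fresh copy of common/
```

The copied folders would also need to be git-ignored (or cleaned up afterwards) so the duplicates don't end up committed.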
@miketheman yes, I have just tried; same result. GitHub project reproducing the issue:
https://github.com/giuseppenicolais/SPR-issue-435
Thanks for supporting this great plugin!
Hello,
Thank you for your input. A more permissive include would be great indeed. Meanwhile I found a workaround: I use "vendor". It certainly was not intended to be used like that, but it works!
```yaml
package:
  individually: true
functions:
  FunctionA:
    module: func/test_func
    handler: handler.run
    package:
      include:
        - "!**/*"
        - func/test_func/handler.py
    vendor: ./common
```
The content of vendor is copied into the Lambda's root. Do you have any thoughts on the use of vendor?
I accidentally stumbled upon the vendor solution today too, and I've got to say it turned out to work like a charm! :) I too hope there's no internal behavior that opposes the use of it in some manner!
You may need to set useStaticCache to false.
Otherwise any change in `vendor: ./common` will not get picked up if requirements.txt does not change.
While developing my function, I am importing the common classes as:

```python
from common.util import SomeUtil
```

However, when I package, I see the util directory in the function folder, but the import statement is invalid. How should I manage this?
I am using this in serverless.yml for my func:

```yaml
vendor:
  - src/common/**
```
Here is my directory structure:

```
src
├── f1
├── f2
└── common
    └── util
```
Did you find a solution for that by any chance?
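Not speaking for the plugin authors, but if vendor behaves as reported earlier in this thread (the *contents* of the vendored folder are copied into the package root), the `common.` prefix can no longer resolve. A small sketch of that copy behavior, with made-up paths, shows why:

```python
# Sketch: simulate copying the contents of common/ into the package root,
# as `vendor: ./common` reportedly does. With that layout,
# `from common.util import SomeUtil` fails (there is no top-level
# `common` package), while `from util import SomeUtil` would resolve.
import shutil
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
(root / "src" / "common" / "util").mkdir(parents=True)
(root / "src" / "common" / "util" / "__init__.py").touch()

package = root / "package"
package.mkdir()
# copy the contents of common/, not the common/ folder itself
shutil.copytree(root / "src" / "common", package, dirs_exist_ok=True)

top_level = sorted(p.name for p in package.iterdir())
print(top_level)  # util lands at the package root, with no common/ parent
```

So either the imports have to drop the `common.` prefix, or (untested assumption) the vendored path would have to point one level up so the `common` package name itself survives.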
This seems to be a more general problem than sharing code between multiple functions; even with a single function, if you have a nested submodule like:
```
app
├── __init__.py
├── handler.py
├── widgets
│   ├── __init__.py
│   ├── one.py
│   └── two.py
└── other.py
```
then handler.py can `from .other import *` but not `from .widgets.one`: it errors that there's no `app.widgets`.
And sure enough there isn't: if you download the .zip from Lambda, the installed version has only the top-level modules, and it's also doubled up with a vendored copy in the zip root which does have them (but they're at the root, not inside app).
I have no idea why this is happening; I've tried `pip install -t test -r my_requirements.txt`, just like the code seems to be doing, and it's absolutely fine: it creates test/app/widgets as expected (and no test/widgets).
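When chasing this kind of discrepancy, it can help to list exactly what ended up in the built artifact rather than eyeballing the AWS console. A small sketch (the `.serverless/...` path in the comment is an assumption; point it at whatever zip your build produced):

```python
# Sketch: list the top-level entries of a built zip artifact to check
# whether nested packages (e.g. app/widgets) actually made it in.
import zipfile

def top_level_entries(zip_path: str) -> list:
    """Return the sorted set of top-level files/folders inside the zip."""
    with zipfile.ZipFile(zip_path) as zf:
        return sorted({name.split("/", 1)[0] for name in zf.namelist()})

# Example usage (hypothetical artifact path):
# print(top_level_entries(".serverless/my-function.zip"))
```

Comparing the output for the locally built zip against the one downloaded from Lambda should make it obvious at which step the nested modules are lost.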