serverless-python-requirements
Every Lambda has the same size
I just noticed that every Lambda deployed with Serverless has the same size.
dashboard: https://app.serverless.com/timpolyma/apps/birdzview/birdzview/dev/eu-central-1
endpoints:
  POST - https://fjwyyf75v9.execute-api.eu-central-1.amazonaws.com/api/open-ai
  GET - https://fjwyyf75v9.execute-api.eu-central-1.amazonaws.com/api/scores/all
  GET - https://fjwyyf75v9.execute-api.eu-central-1.amazonaws.com/api/scores/{id}
functions:
  fetch_open_ai: birdzview-dev-fetch_open_ai (105 MB)
  get_all_scores: birdzview-dev-get_all_scores (105 MB)
  get_scores_per_user: birdzview-dev-get_scores_per_user (105 MB)
layers:
  pythonRequirements: arn:aws:lambda:eu-central-1:028655318971:layer:birdzview-dev-python-requirements:8
I think my structure might be wrong, or is this the intended behavior?
I placed all the files in one module, /lib, which might be wrong.
However, it seems like only the dependencies listed in the Pipfile matter.
The project needs some big dependencies, so I can't fully deploy it at the moment.
Individual packaging didn't work for me.
Is there any way to further reduce the file size of the functions?
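For reference, a quick way to see which dependencies dominate the size is to run sls package and list the largest entries in the generated archive, for example with a small script like the one below (the archive name under .serverless/ is an assumption and may differ in your setup):

# After `sls package`, list the largest entries in the generated archive
# to see which dependencies dominate the deployment size.
import zipfile

def largest_entries(zip_path, top=15):
    with zipfile.ZipFile(zip_path) as zf:
        entries = sorted(zf.infolist(), key=lambda e: e.file_size, reverse=True)
        for entry in entries[:top]:
            print(f"{entry.file_size / 1_000_000:8.2f} MB  {entry.filename}")

largest_entries(".serverless/pythonRequirements.zip")  # adjust to the archive actually produced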
serverless.yml
provider:
  name: aws
  runtime: python3.9
  region: eu-central-1
  httpApi:
    cors: true
  # environment:
  #   PYTHONPATH: "/var/task/vendored:/var/runtime"

custom:
  pythonRequirements:
    layer: true
    zip: true
    slim: true

functions:
  fetch_open_ai:
    handler: handler._fetch_open_ai
    events:
      - httpApi:
          path: /api/open-ai
          method: post
  get_all_scores:
    handler: handler._get_all_scores
    events:
      - httpApi:
          path: /api/scores/all
          method: get
  get_scores_per_user:
    handler: handler._get_scores_per_user
    events:
      - httpApi:
          path: /api/scores/{id}
          method: get

plugins:
  - serverless-python-requirements
handler.py
try:
    import unzip_requirements  # provided at runtime because of zip: true
except ImportError:
    pass

import json

from lib.open_ai.main import fetch_open_ai
from lib.scores.all_scores import get_all_scores
from lib.scores.scores_per_user import get_scores_per_user


def _fetch_open_ai(event, context):
    data = json.loads(event['body'])
    text = fetch_open_ai(data)
    response = {'statusCode': 200, 'body': json.dumps(text)}
    return response


def _get_all_scores(event, context):
    get_all_scores()
    response = {"statusCode": 200}
    return response


def _get_scores_per_user(event, context):
    id = event['pathParameters']['id']
    print(id)  # check README
    result = get_scores_per_user(id)
    response = {"statusCode": 200, "body": result}
    return response
Since the same zip file is used for every Lambda deployment (unless the individually packaging option is set), this is the intended behavior.
There is a way to reduce the zip file size of each Lambda function: check the Per-function requirements section in the official docs.
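For illustration, here is a rough sketch of what that could look like for this service, assuming each function is moved into its own directory containing its own requirements.txt. The directory names below are placeholders, and the exact handler/module layout should be checked against the plugin README.

serverless.yml (per-function packaging sketch)
package:
  individually: true

functions:
  fetch_open_ai:
    handler: open_ai/handler._fetch_open_ai
    module: open_ai              # placeholder directory with its own requirements.txt
    events:
      - httpApi:
          path: /api/open-ai
          method: post
  get_all_scores:
    handler: scores/handler._get_all_scores
    module: scores               # placeholder directory shared by the score functions
    events:
      - httpApi:
          path: /api/scores/all
          method: get

With this layout, each function should only package the requirements listed in its own module's requirements.txt rather than everything in the service-level Pipfile.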