serverless-python-requirements
AWS Lambda: cannot import name 'etree' from 'lxml'
After uploading and running the Lambda function, I receive the following error:
{
    "errorMessage": "cannot import name 'etree' from 'lxml' (/var/task/lxml/__init__.py)",
    "errorType": "ImportError",
    "stackTrace": [
My serverless.yml has the following settings applied:
custom:
pythonRequirements:
dockerizePip: true
dockerFile: Dockerfile
plugins:
- serverless-python-requirements
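(For context, a minimal Dockerfile used with dockerizePip might look like the sketch below. The base image and Python version here are assumptions for illustration, not taken from the actual project; the point is that the build image should match the Lambda runtime so compiled wheels like lxml's are ABI-compatible.)

```dockerfile
# Hypothetical build image for dockerizePip; pick the tag that matches
# your Lambda runtime so native extensions are built for the right ABI.
FROM public.ecr.aws/sam/build-python3.9

# Extra system packages needed to build lxml from source could go here, e.g.:
# RUN yum install -y libxml2-devel libxslt-devel
```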
I have lxml installed in my venv (it's listed in my requirements.txt file).
I inspected the package using the sls package and zipinfo commands, and I can see that lxml is on board.
I'm working in an OS X environment.
Any help with this?
Bump! Same error here. How can this be fixed?
This sounds like a naming issue. Is it possible that there's another file named lxml in the Python module search path? From the error message above, it appears that there's a file at /var/task/lxml/__init__.py, making the lxml namespace a Python package, which would collide with the lxml library.
Ref: https://stackoverflow.com/a/13750302/244037
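One quick way to check for this kind of shadowing (a generic sketch, not specific to this project) is to print where Python actually resolves a module name from:

```python
import importlib.util

def module_origin(name):
    """Return the file path Python would load the given module from,
    or None if the module cannot be found at all."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# Inside the Lambda handler you could log module_origin("lxml");
# if it points at an unexpected file under /var/task, something in the
# deployment package is shadowing the real library.
print(module_origin("json"))  # stdlib example; prints the json package path
```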
I had the same error, but discovered that locally I was using Python 3.7 while the Lambda function was using 3.8. The package contains lxml/etree.cpython-37m-x86_64-linux-gnu.so, whose cpython-37m ABI tag means it was built for CPython 3.7 and cannot be loaded by 3.8. Changing the Lambda runtime to 3.7 resolved the problem.
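The mismatch is visible in the extension's filename itself. As a rough illustration (a hypothetical helper, not part of the plugin), you could compare the .so file's ABI tag against the running interpreter:

```python
import re
import sys

def so_python_version(filename):
    """Extract the (major, minor) CPython version an extension module was
    built for, from its ABI tag, e.g.
    'etree.cpython-37m-x86_64-linux-gnu.so' -> (3, 7)."""
    m = re.search(r"\.cpython-(\d)(\d+)", filename)
    return (int(m.group(1)), int(m.group(2))) if m else None

built_for = so_python_version("etree.cpython-37m-x86_64-linux-gnu.so")
running = sys.version_info[:2]
if built_for and built_for != running:
    print(f"ABI mismatch: built for {built_for}, running {running}")
```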
Same issue using both the Python 3.7 and 3.8 runtimes. Any clues?
[DEBUG] 2021-05-10T12:14:49.514Z 4eafe7bb-8db4-4974-9c07-67ceb7233766 failed to load parser
Traceback (most recent call last):
  File "/var/task/lambda/__init__.py", line 139, in run
    parser = get_parser()
  File "/var/lang/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/var/task/lambda/parser.py", line 5, in <module>
    from lxml import html
  File "/var/task/lxml/html/__init__.py", line 53, in <module>
    from .. import etree
ImportError: cannot import name 'etree' from 'lxml' (/var/task/lxml/__init__.py)
Same issue with python == 3.11
[ERROR] Runtime.ImportModuleError: Unable to import module 'Lambdas/article-info-extract-lambda/lambda-get-article-body': cannot import name 'etree' from 'lxml' (/var/task/lxml/__init__.py)
Same issue here with python3.11
Hey @manuelrech - are you able to provide a reproducible example? This looks like an issue with dependencies, not necessarily with the plugin, but having a reproducible example would definitely help verify where the problem lies. Thanks in advance 🙇
I have the same error
I've been running into the AWS Lambda issue below since last week.
{
    "errorMessage": "Unable to import module 'app': lxml.html.clean module is now a separate project lxml_html_clean.\nInstall lxml[html_clean] or lxml_html_clean directly.",
    "errorType": "Runtime.ImportModuleError",
    "requestId": "81dfd8e6-af2e-4a1f-b4e7-a87f81d41c78",
    "stackTrace": []
}
I managed to solve it (but I had to downgrade to Python 3.9; it was the only version where lxml installed successfully):
1. Clone https://github.com/aws/aws-lambda-base-images/tree/python3.9
2. Change the entrypoint in the Dockerfile to this: ENTRYPOINT ["bash"]
3. Build the image: docker build -t awspy39 -f ./Dockerfile.python3.9 .
4. Run the container with a volume mapping so you can retrieve the files: docker run -it -v ./out:/mnt/packaging awspy39:latest
5. Run your pip install. Mine looked like this: cd /mnt/packaging; pip3.9 install requests==2.26.0 jusText beautifulsoup4 langdetect lxml[html_clean] -t .
6. Exit the container and zip your code together with the built libraries.
7. Upload your zip file to AWS Lambda and choose the Python 3.9 runtime.
Thanks @cnmoro. I also made progress on this issue by adding the library below to the requirements.txt file so it gets installed during bootstrap: lxml_html_clean. This works with python3.9.
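The same workaround can also be handled defensively in code. Below is a generic "first importable module" fallback pattern (a sketch; the demo uses stdlib module names since lxml may not be installed everywhere), which would let you try the new lxml_html_clean package before the old lxml.html.clean path:

```python
import importlib

def first_importable(candidates):
    """Import and return the first module from `candidates` that is
    actually installed; raise ImportError if none of them are."""
    for name in candidates:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    raise ImportError(f"none of {candidates} could be imported")

# For the lxml split you would try the new package first, e.g.:
#   clean = first_importable(["lxml_html_clean", "lxml.html.clean"])
# Demo with stdlib names so this runs anywhere:
mod = first_importable(["definitely_missing_xyz", "json"])
print(mod.__name__)  # -> json
```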
Thanks to both @a30001784 and @cnmoro for providing more details. It further confirms that the issue is not necessarily with the plugin itself, so I'm going to close the issue for now (unfortunately I cannot convert it to a discussion). If it turns out the issue is within the plugin itself, we'll reopen it 💯