confluent-kafka-python
AttributeError: module 'os' has no attribute 'add_dll_directory' - Python 3.8 - Azure Function
Description
I've installed the module confluent_kafka==1.8.2 for an Azure Function and I get this error:
Result: Failure Exception: AttributeError: module 'os' has no attribute 'add_dll_directory'
Stack:
File "/azure-functions-host/workers/python/3.8/LINUX/X64/azure_functions_worker/dispatcher.py", line 355, in _handle__function_load_request
    func = loader.load_function(
File "/azure-functions-host/workers/python/3.8/LINUX/X64/azure_functions_worker/utils/wrappers.py", line 40, in call
    return func(*args, **kwargs)
File "/azure-functions-host/workers/python/3.8/LINUX/X64/azure_functions_worker/loader.py", line 127, in load_function
    mod = importlib.import_module(fullmodname)
File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 843, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/site/wwwroot/CopyTransactionHttpTrigger/__init__.py", line 8, in <module>
    from confluent_kafka import avro
File "/home/site/wwwroot/.python_packages/lib/site-packages/confluent_kafka/__init__.py", line 18, in <module>
    _delvewheel_init_patch_16682001180()
File "/home/site/wwwroot/.python_packages/lib/site-packages/confluent_kafka/__init__.py", line 9, in _delvewheel_init_patch_16682001180
    os.add_dll_directory(libs_dir)
I don't know if the error is due to the confluent_kafka module. Should I add a path? Does anyone have an idea why os doesn't have that method?
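For context, os.add_dll_directory was added in Python 3.8 as a Windows-only API; on Linux (and macOS) the attribute simply does not exist, which is exactly the AttributeError in the stack trace above. A quick check:

```python
import os
import platform

# os.add_dll_directory exists only on Windows (Python 3.8+);
# on Linux/macOS the attribute is absent, so any code calling it
# unconditionally will raise AttributeError there.
def dll_api_available() -> bool:
    return hasattr(os, "add_dll_directory")

print(platform.system(), dll_api_available())
```

So the question is really why Windows-only code ended up in a package running on Linux.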
Checklist
Please provide the following information:
- [X] confluent-kafka-python and librdkafka version (confluent_kafka.version() and confluent_kafka.libversion()):
- [ ] Apache Kafka broker version:
- [ ] Client configuration: {...}
- [X] Operating system: Linux
- [ ] Provide client logs (with 'debug': '..' as necessary)
- [ ] Provide broker log excerpts
- [ ] Critical issue
Seems like a problem with delvewheel not being able to identify the platform correctly.
@edenhill do you think it's a problem with the Azure infrastructure or with the configuration in Azure? Or should I install delvewheel in the function?
It's a bit weird because delvewheel should only be used on Windows, but from the stack trace it looks like this is running on Linux. The Windows wheel should not even be installable on Linux.
Make sure you install the correct wheel (manylinux, not windows).
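One way to confirm that the Windows wheel slipped in is to look for the shim that delvewheel injects into the package's __init__.py (the _delvewheel_init_patch_... call visible in the stack trace above). A minimal heuristic sketch, assuming the package layout shown in the trace:

```python
import pathlib

def looks_like_windows_wheel(init_py: pathlib.Path) -> bool:
    """Heuristic: delvewheel (the Windows wheel-repair tool) injects a
    '_delvewheel_init_patch_*' / '_delvewheel_patch_*' shim into the
    package __init__.py. Finding it on a Linux deployment means the
    Windows wheel was installed."""
    text = init_py.read_text(encoding="utf-8", errors="ignore")
    return "_delvewheel" in text
```

For example, point it at .python_packages/lib/site-packages/confluent_kafka/__init__.py in the deployed app; True means the wrong (Windows) wheel was bundled.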
@edenhill I'm not sure I get what you mean. I develop the function on Windows locally, and I know it runs on Linux on Azure. Should I install the manylinux wheel locally and deploy the function to Azure again? The function is on Python 3.8, so that would be "manylinux2014". If I run/deploy the function locally from a Linux environment, do you think I can avoid this issue?
I've solved it: running the function in a virtual environment on Linux locally and deploying it from a Linux machine instead of Windows made it work.
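A pre-deployment sanity check can catch this class of mistake early: Windows wheels ship .pyd/.dll extension files, while Linux wheels ship .so files. A small sketch, assuming the deployment folder layout from the stack trace above:

```python
import pathlib

def find_windows_binaries(site_packages: str) -> list[str]:
    """Return Windows-only binary files (.pyd/.dll) found under the given
    deployment folder. Any hit means a Windows wheel was bundled and the
    import will fail on a Linux host."""
    root = pathlib.Path(site_packages)
    return sorted(
        str(p.relative_to(root))
        for p in root.rglob("*")
        if p.suffix.lower() in {".pyd", ".dll"}
    )
```

Running it against .python_packages/lib/site-packages before deploying should return an empty list; anything else points at a Windows wheel.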
I am also getting the same error.
Background: I develop on Windows but deployment to AWS is via Linux (GitLab CI/CD). For no apparent reason, a working app started failing. Any idea @edenhill why this happened/happens?
Stack trace:
Hi @shot87 , I am facing the same problem in my Python 3.10 lambda. Did you find any solution?
@savanbthakkar as far as I remember my research lead to some topics saying that a package update was the issue. So I think my fix was adding into requirements
urllib3<2
please let me know if this fixes your issue or not
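For anyone trying this, the pin goes into requirements.txt; note that it must be a downgrade constraint, not an upgrade (a sketch of the fix described above, versions unverified):

```
# requirements.txt
confluent-kafka==1.8.2   # version from the original report
urllib3<2                # pin suggested in this thread
```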
Hi @shot87, @savanbthakkar, I'm having the same issue, so I upgraded my urllib3 to 2.0.7, but the issue is still there. Any help?
@Asabeaaa as far as I remember, urllib3<2 should downgrade this library (not upgrade it). In my case a higher version of this library was conflicting with the confluent-kafka libs.
@shot87 I had to download the pandas and numpy .whl files from PyPI and extract the libraries from there instead.
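Since a .whl file is just a zip archive, "extracting the libraries" can be done directly with the standard library. A minimal sketch, where the wheel filename and target folder are hypothetical placeholders:

```python
import zipfile

def unpack_wheel(wheel_path: str, target_dir: str) -> None:
    """A .whl is a zip archive: unpack its contents into a
    site-packages-style folder supplied by the caller."""
    with zipfile.ZipFile(wheel_path) as whl:
        whl.extractall(target_dir)
```

For example, downloading a manylinux wheel from PyPI and calling unpack_wheel("numpy-...-manylinux2014_x86_64.whl", ".python_packages/lib/site-packages") drops the Linux build of the package into the deployment folder; just make sure the wheel's Python version and platform tags match the runtime.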
Hi, I am facing the same problem. I'm trying to deploy an Azure Function application that relies on various libraries, including numpy, torch, and pandas (all the prerequisites are listed in the attached image). While the application runs smoothly on my Ubuntu 22.04 laptop, I encounter a persistent issue upon deployment to the remote Azure Function. Specifically, I keep receiving the following error: "AttributeError: module 'os' has no attribute 'add_dll_directory'." I've provided an image displaying these errors for reference.
In my attempts to resolve this issue, I explored several solutions available online, including this one, but nothing worked for me. Additionally, I verified that the __init__.py file within my local pandas installation does not include a line invoking _delvewheel_patch_1_5_1(). Consequently, I suspect the problem stems from using a pandas build intended for Windows rather than Linux.
AWS Lambda runs in a Linux environment, so pushing packages installed on a Windows machine may result in errors. Although your local setup on Windows may work correctly, it's crucial to package and deploy on a Linux system to ensure compatibility with AWS Lambda.
Someone has to resolve this properly; packages downloaded on Linux create other issues. We should fix this for good rather than keep working around it.