kombu
import exception raised in transport/redis: "module 'redis' has no attribute 'client'"
Hi, debugging dynamic imports is really beyond my skills. Although the issue occurs during Airflow pipeline execution, from the stack trace it looks like the kombu utils import is having trouble. I also made sure the redis client is properly installed.
Python 3.8.6
Name: kombu Version: 5.3.2
Name: apache-airflow Version: 2.7.2
Name: redis Version: 4.6.0
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/celery/executors/celery_executor_utils.py", line 199, in send_task_to_executor
result = task_to_run.apply_async(args=[command], queue=queue)
File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/task.py", line 594, in apply_async
return app.send_task(
File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/base.py", line 794, in send_task
with self.producer_or_acquire(producer) as P:
File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/base.py", line 929, in producer_or_acquire
producer, self.producer_pool.acquire, block=True,
File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/base.py", line 1344, in producer_pool
return self.amqp.producer_pool
File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/amqp.py", line 590, in producer_pool
self.app.connection_for_write()]
File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/base.py", line 826, in connection_for_write
return self._connection(url or self.conf.broker_write_url, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/base.py", line 877, in _connection
return self.amqp.Connection(
File "/home/airflow/.local/lib/python3.8/site-packages/kombu/connection.py", line 201, in init
if not get_transport_cls(transport).can_parse_url:
File "/home/airflow/.local/lib/python3.8/site-packages/kombu/transport/init.py", line 90, in get_transport_cls
_transport_cache[transport] = resolve_transport(transport)
File "/home/airflow/.local/lib/python3.8/site-packages/kombu/transport/init.py", line 75, in resolve_transport
return symbol_by_name(transport)
File "/home/airflow/.local/lib/python3.8/site-packages/kombu/utils/imports.py", line 59, in symbol_by_name
module = imp(module_name, package=package, **kwargs)
File "/usr/local/lib/python3.8/importlib/init.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "
[airflow@xxxxx-scheduler-1 airflow]$ ls /home/airflow/.local/lib/python3.8/site-packages/redis
__init__.py  __pycache__  asyncio  backoff.py  client.py  cluster.py  commands  compat.py  connection.py  crc.py  credentials.py  exceptions.py  lock.py  ocsp.py  retry.py  sentinel.py  typing.py  utils.py
[airflow@xxxx-scheduler-1 airflow]$ vi /home/airflow/.local/lib/python3.8/site-packages/kombu/transport/redis.py
I suspect that the error is caused by the local module name redis.py shadowing the redis package module.
I have not reproduced or confirmed this suspicion, but we are running into the same problem on our Airflow deployment, and restarting the scheduler by hand (the server that sends tasks to the queue) will often fix the issue.
I am posting this message here in case someone can tell me I am way off, or perhaps it helps come up with a workaround/fix for an actual bug.
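For what it's worth, a generic shadowing scenario does reproduce exactly this error message. Below is a minimal, self-contained sketch (the stray-file setup is hypothetical, not taken from any deployment in this thread) showing that a plain redis.py sitting ahead of the real package on sys.path makes redis.client disappear:

```python
# Minimal sketch of the shadowing hypothesis; file names are hypothetical.
import pathlib
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    # A stray module called redis.py, e.g. dropped into a DAGs or plugins folder.
    pathlib.Path(tmp, "redis.py").write_text("# not the redis-py package\n")

    probe = (
        "import redis; "
        "print('redis resolved to:', redis.__file__); "
        "print(redis.client)"
    )
    # `python -c` puts the current working directory first on sys.path, so the
    # stray redis.py wins over the installed package and the last statement fails
    # with: AttributeError: module 'redis' has no attribute 'client'
    subprocess.run([sys.executable, "-c", probe], cwd=tmp, check=False)
```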
I am facing the same issue. Did you find any workaround for this?
Airflow image: apache/airflow:2.7.0-python3.8
redis==5.0.0 kombu==5.3.1 celery==5.3.1
And the only extra libraries I install through pip are:
- boto3==1.33.1
- botocore==1.33.1
- aiobotocore>=2.5.4,<3.0.0
- s3transfer<0.8.2
- s3fs==2023.12.2
- cx-Oracle==8.3.0
- pymongo==3.11.3
- google-api-python-client==2.22.0
- oauth2client==4.1.3
The deployment is with Docker on ECS instances with 4 services (all of them using the same image): webserver, scheduler, Celery worker and Celery Flower. The RDS we use is serverless (Aurora PostgreSQL 14.6) and the Celery backend is ElastiCache (Redis 7.0.7).
Issue: Although the scheduler runs smoothly after a build, sometimes when we restart it, it fails to send tasks to Redis so the worker can fetch and execute them (showing the same error described above). This is fixed only by restarting the scheduler service, and the fix is temporary, lasting only until the next restart.
P.S.: I don't do anything with PYTHONPATH or PATH. I just add the DAGs and all custom Python modules to base_dags_folder.
Hi, I moved to the latest Airflow version. This issue is gone for me.
Thanks for the response @rdjouder. I suppose you mean 2.8.1; and which Python version, please?
Hi team,
this issue is also happening on our Airflow service. It works for some time and then randomly starts to fail. I also updated Airflow to the latest version. Every time it starts to happen, we need to manually kill the Airflow scheduler pod, and that fixes it.
Can someone please help here? Some information below:
Airflow version: 2.8.3, Python version: 3.11, redis version: 4.6.0, kombu version: 5.3.5
Hi team,
writing again as this issue is becoming unsustainable. Can anyone help with it? Thanks a lot!
Hello @zar777, you could try adding redis>=4.5.2,<5.0.0,!=4.5.5 to your requirements.txt file.
hey @GPrks ,
thanks a lot for your quick reply. Trying it right now. Will keep you posted. Thanks!
@GPrks So, does it work ?
Hi @DamnDam ,
I applied this change, plus I renamed one import of redis I was doing directly in the code, and deployed a few days ago. No issues so far, but we are running more intensive tests to see whether the issue is fixed. I will let you know.
Hi team,
We have exactly the same problem as @zar777 described. Adding redis>=4.5.2,<5.0.0,!=4.5.5 to the requirements.txt file didn't help. This is the error I see:
@dat-a-ish @acmarco @rdjouder
In the most recent version of Airflow (2.9.2), using the default docker-compose file, I'm having the same problem:
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/celery/executors/celery_executor_utils.py", line 220, in send_task_to_executor
result = task_to_run.apply_async(args=[command], queue=queue)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.12/site-packages/celery/app/task.py", line 594, in apply_async
return app.send_task(
^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.12/site-packages/celery/app/base.py", line 797, in send_task
with self.producer_or_acquire(producer) as P:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.12/site-packages/celery/app/base.py", line 932, in producer_or_acquire
producer, self.producer_pool.acquire, block=True,
^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.12/site-packages/celery/app/base.py", line 1354, in producer_pool
return self.amqp.producer_pool
^^^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.12/site-packages/celery/app/amqp.py", line 591, in producer_pool
self.app.connection_for_write()]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.12/site-packages/celery/app/base.py", line 829, in connection_for_write
return self._connection(url or self.conf.broker_write_url, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.12/site-packages/celery/app/base.py", line 880, in _connection
return self.amqp.Connection(
^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.12/site-packages/kombu/connection.py", line 201, in __init__
if not get_transport_cls(transport).can_parse_url:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.12/site-packages/kombu/transport/__init__.py", line 90, in get_transport_cls
_transport_cache[transport] = resolve_transport(transport)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.12/site-packages/kombu/transport/__init__.py", line 75, in resolve_transport
return symbol_by_name(transport)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/airflow/.local/lib/python3.12/site-packages/kombu/utils/imports.py", line 59, in symbol_by_name
module = imp(module_name, package=package, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/importlib/__init__.py", line 90, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 995, in exec_module
File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
File "/home/airflow/.local/lib/python3.12/site-packages/kombu/transport/redis.py", line 282, in <module>
class PrefixedRedisPipeline(GlobalKeyPrefixMixin, redis.client.Pipeline):
^^^^^^^^^^^^
AttributeError: module 'redis' has no attribute 'client'
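When this happens in the scheduler, one way to narrow it down (just a diagnostic sketch, not an official fix) is to check which module the name redis actually resolves to and whether the redis.client submodule ever got loaded:

```python
# Diagnostic sketch: run inside the failing scheduler environment.
import importlib
import sys

import redis

# Should point at .../site-packages/redis/__init__.py; anything else
# (a stray redis.py, or a half-initialised module) suggests shadowing.
print("redis resolved to:", getattr(redis, "__file__", "<unknown>"))
print("has 'client' attribute:", hasattr(redis, "client"))
print("redis.client in sys.modules:", "redis.client" in sys.modules)

# Importing the submodule explicitly also sets the attribute on the package,
# so redis.client.Pipeline becomes reachable again if the attribute was merely
# missing (as opposed to the whole package being shadowed).
importlib.import_module("redis.client")
print("after explicit import:", hasattr(redis, "client"))
```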
Hi @italovpg,
Try updating your requirements file and setting redis to version 4.6.0, i.e.: redis==4.6.0
It helped me (at least I haven't faced that problem since I updated it).
I had to upgrade Airflow from 2.9.2 to 2.9.3 to solve the issue.
redis==4.6.0 worked for me, but upgrading Airflow from 2.9.2 to 2.9.3 didn't.
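Since the reports above disagree on which pin actually helps, it may be worth confirming which versions the scheduler container resolves at runtime, as opposed to what requirements.txt asks for. A small sketch (the list of distributions is just an example):

```python
# Hedged sketch: print the versions actually installed in the running environment.
from importlib.metadata import PackageNotFoundError, version

for dist in ("redis", "kombu", "celery", "apache-airflow"):
    try:
        print(f"{dist}=={version(dist)}")
    except PackageNotFoundError:
        print(f"{dist}: not installed")
```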