
Latest working celery/redis cannot inspect: Error: No nodes replied within time constraint.

Open jslusher opened this issue 5 years ago • 37 comments

  • [x] I have read the relevant section in the contribution guide on reporting bugs.
  • [x] I have checked the issues list for similar or identical bug reports.
  • [x] I have checked the pull requests list for existing proposed fixes.
  • [x] I have checked the commit log to find out if the bug was already fixed in the master branch.
  • [x] I have included all related issues and possible duplicate issues in this issue (If there are none, check this box anyway).

Mandatory Debugging Information

  • [x] I have included the output of celery -A proj report in the issue. (if you are not able to do this, then at least specify the Celery version affected).
  • [x] I have verified that the issue exists against the master branch of Celery.
  • [x] I have included the contents of pip freeze in the issue.
  • [x] I have included all the versions of all the external dependencies required to reproduce this bug.

Related Issues

  • https://github.com/celery/celery/issues/1456
  • https://github.com/celery/celery/issues/4688
  • https://github.com/celery/celery/issues/3453

Possible Duplicates

  • None

Environment & Settings

Celery version: 4.3.0

celery report Output:

software -> celery:4.3.0 (rhubarb) kombu:4.6.4 py:2.7.16
            billiard:3.6.1.0 redis:3.2.1
platform -> system:Linux arch:64bit
            kernel version:3.10.0-957.27.2.el7.x86_64 imp:CPython
loader   -> celery.loaders.app.AppLoader
settings -> transport:sentinel results:disabled

CELERY_QUEUES:
    (<unbound Queue celery -> <unbound Exchange celery(direct)> -> celery>,
 <unbound Queue fast -> <unbound Exchange fast(direct)> -> fast>,
 <unbound Queue slow -> <unbound Exchange slow(direct)> -> slow>,
 <unbound Queue mp-fast -> <unbound Exchange mp-fast(direct)> -> mp-fast>,
 <unbound Queue mp-slow -> <unbound Exchange mp-slow(direct)> -> mp-slow>)
BROKER_TRANSPORT_OPTIONS: {
    'master_name': 'staging'}
BROKER_URL: u'sentinel://redis-s1.example.domain.com:26379//'
CELERY_ALWAYS_EAGER: False
CELERY_DISABLE_RATE_LIMITS: True
CELERY_ACCEPT_CONTENT: ['json']
CELERYD_MAX_TASKS_PER_CHILD: 2000
CELERY_IMPORTS:
    ('tasks',)
CELERY_EAGER_PROPAGATES_EXCEPTIONS: True
CELERY_STORE_ERRORS_EVEN_IF_IGNORED: True
CELERY_IGNORE_RESULT: True
CELERY_TASK_SERIALIZER: 'json'

Steps to Reproduce

Required Dependencies

  • Minimal Python Version: 2.7.16
  • Minimal Celery Version: 4.3.0
  • Minimal Kombu Version: 4.6.4
  • Minimal Broker Version: redis 3.0.6
  • Minimal Result Backend Version: N/A or Unknown
  • Minimal OS and/or Kernel Version: N/A or Unknown
  • Minimal Broker Client Version: N/A or Unknown
  • Minimal Result Backend Client Version: N/A or Unknown

Python Packages

pip freeze Output:

ABN==0.4.2
address==0.1.1
akismet==1.0.1
amqp==2.5.1
asn1crypto==0.24.0
attrs==19.1.0
Authlib==0.11
Authomatic==0.0.13
awesome-slugify==1.6.2
Babel==2.6.0
backports.functools-lru-cache==1.5
billiard==3.6.1.0
bleach==1.5.0
boto==2.38.0
cachetools==3.1.1
cas-client==1.0.0
celery==4.3.0
certifi==2017.7.27.1
cffi==1.12.3
chardet==3.0.4
click==6.7
configparser==3.8.1
contextlib2==0.5.5
coverage==4.5.4
cryptography==2.0.3
cssselect==0.9.2
cycler==0.10.0
datadog==0.11.0
ddtrace==0.25.0
decorator==4.4.0
dnspython==1.16.0
docopt==0.4.0
docutils==0.15.2
elasticsearch==6.3.1
enum34==1.1.6
filelock==3.0.12
funcsigs==1.0.2
future==0.17.1
google-auth==1.6.2
hiredis==0.2.0
html5lib==0.9999999
httplib2==0.13.1
idna==2.8
importlib-metadata==0.19
ipaddress==1.0.22
isodate==0.5.4
itsdangerous==0.24
Jinja2==2.7.1
kafka-python==1.4.6
kiwisolver==1.1.0
kombu==4.6.4
lmtpd==6.0.0
lockfile==0.12.2
loginpass==0.2.1
lxml==3.6.1
mandrill==1.0.57
Markdown==2.2.1
MarkupSafe==0.18
matplotlib==2.2.4
mock==1.0.1
more-itertools==5.0.0
mysqlclient==1.3.9
netaddr==0.7.19
numpy==1.16.4
oauth2==1.9.0.post1
packaging==19.1
passlib==1.6.1
pathlib2==2.3.4
paypalrestsdk==0.6.2
Pillow==2.8.1
pluggy==0.6.0
psutil==5.6.3
py==1.8.0
pyasn1==0.4.6
pyasn1-modules==0.2.6
PyBabel-json==0.2.0
pybreaker==0.5.0
pycountry==18.2.23
pycparser==2.19
pycryptodome==3.8.2
PyJWT==0.4.1
pylibmc==1.6.0
pyparsing==2.4.2
pytest==3.5.0
pytest-cov==2.4.0
python-daemon==2.1.2
python-dateutil==2.1
pytz==2014.4
PyYAML==3.12
raven==5.31.0
redis==3.2.1
regex==2018.11.3
requests==2.7.0
rsa==4.0
salmon-mail==3.0.0
scandir==1.10.0
simple-db-migrate==3.0.0
simplejson==3.10.0
six==1.11.0
SQLAlchemy==1.0.6
subprocess32==3.5.4
sudz==1.0.3
termcolor==1.1.0
toml==0.10.0
tox==3.13.2
Unidecode==0.4.21
urllib3==1.25.3
uWSGI==2.0.17.1
vine==1.3.0
virtualenv==16.7.2
Werkzeug==0.11.15
WTForms==1.0.5
zipp==0.5.2

Other Dependencies

N/A

Minimally Reproducible Test Case

Expected Behavior

I expect celery -A app inspect ping (as well as other subcommands of celery inspect) to return output.
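For reference, a healthy deployment replies with one line per worker node, roughly like this (the worker hostname below is only an illustrative placeholder):

celery -A app inspect ping
-> celery@worker1: OK
        pong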

Actual Behavior

This configuration and version of celery/redis/sentinel had been working fine until recently, and I'm not sure what changed. I'm guessing it might have something to do with conflicting packages (given how many there are in this python env 👀), but I'm not sure what else to check. By looking at the keys in redis and by using tcpdump, I can verify that celery is definitely able to reach the redis servers through the sentinel brokers. The celery deployment is also serving tasks and otherwise seems to be working normally. For some reason, though, I can't run any of the inspect subcommands without getting Error: No nodes replied within time constraint.

The only thing I see in the debug logs is, once again, proof that the celery workers are receiving the message, but still nothing comes back:

[2019-08-20 16:34:23,472: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'routing_key': 'dbc97d66-fe94-3d6d-aa6a-bb965893ae2b', 'exchange': 'reply.celery.pidbox'} ticket:19949cbb-6bf0-4b36-89f7-d5851c0bddd0]

We also captured redis traffic using MONITOR and we can see that pings are being keyed and populated: https://gist.github.com/jslusher/3b24f7676c93f90cc55e1330f6e595d8
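For anyone repeating that check: MONITOR has to be run against the current Redis master rather than the Sentinel port, and the broadcast/reply traffic shows up under keys containing pidbox. Something along these lines should work (the host placeholder stands in for whichever master Sentinel reports):

redis-cli -h <redis-master-host> -p 6379 monitor | grep -i pidbox
redis-cli -h <redis-master-host> -p 6379 keys '*pidbox*'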

jslusher avatar Aug 20 '19 23:08 jslusher

Did you check the comments on this issue? https://github.com/celery/celery/issues/4688

auvipy avatar Aug 21 '19 03:08 auvipy

I can confirm this issue. It seems to be a result of a recent kombu release - 4.6.3 is working while 4.6.4 is not. I'm going to keep digging.

halfdan avatar Aug 21 '19 05:08 halfdan

I can confirm this issue. It seems to be a result of a recent kombu release - 4.6.3 is working while 4.6.4 is not. I'm going to keep digging.

Should we move this issue to the kombu repo then?

auvipy avatar Aug 21 '19 06:08 auvipy

For reference:

Working

software -> celery:4.3.0 (rhubarb) kombu:4.6.3 py:3.7.3
            billiard:3.6.1.0 redis:3.3.8
platform -> system:Darwin arch:64bit
            kernel version:18.6.0 imp:CPython
loader   -> celery.loaders.app.AppLoader
settings -> transport:redis results:redis://localhost:6379/

Not working

software -> celery:4.3.0 (rhubarb) kombu:4.6.4 py:3.7.3
            billiard:3.6.1.0 redis:3.3.8
platform -> system:Darwin arch:64bit
            kernel version:18.6.0 imp:CPython
loader   -> celery.loaders.app.AppLoader
settings -> transport:redis results:redis://localhost:6379/

halfdan avatar Aug 21 '19 06:08 halfdan

Should we move this issue to the kombu repo then?

Sounds reasonable.

halfdan avatar Aug 21 '19 06:08 halfdan

https://github.com/celery/kombu/issues/1081 seems related

halfdan avatar Aug 21 '19 06:08 halfdan

With celery 4.3, don't use kombu 4.6. Could you try kombu 4.6.4 with celery 4.4.0rc3?

auvipy avatar Aug 21 '19 07:08 auvipy

@auvipy Happy to try, but if kombu 4.6 shouldn't be used with celery 4.3, I suggest updating https://github.com/celery/celery/blob/master/requirements/default.txt#L3, since it currently requires kombu>=4.6.4,<5.0 on master and kombu>=4.4.0,<5.0.0 in the Celery 4.3.0 release. Celery 4.3.0 is broken as a result. Can we do a patch release 4.3.1 where kombu is pinned to kombu>=4.4.0,<4.6.4?

halfdan avatar Aug 21 '19 08:08 halfdan

@auvipy This is also broken in 4.4.0rc3

software -> celery:4.4.0rc3 (cliffs) kombu:4.6.4 py:3.7.3
            billiard:3.6.1.0 redis:3.3.8
platform -> system:Darwin arch:64bit
            kernel version:18.6.0 imp:CPython
loader   -> celery.loaders.app.AppLoader
settings -> transport:redis results:redis://localhost:6379/

halfdan avatar Aug 21 '19 08:08 halfdan

OK thanks for verifying

auvipy avatar Aug 21 '19 10:08 auvipy

Maybe https://github.com/celery/kombu/issues/1090 is a duplicate or a related issue.

auvipy avatar Aug 27 '19 06:08 auvipy

I see this issue is now closed. Is it resolved in master? What's the PR that fixes the problem?

yarinb avatar Aug 31 '19 18:08 yarinb

try this https://github.com/celery/kombu/pull/1089

auvipy avatar Aug 31 '19 18:08 auvipy

@yarinb Let me know how it goes. I provided unit test coverage, but I want feedback: I have extensive Celery/Kombu experience using AMQP and a decent Redis background, so I wanted to learn more about the kombu library by trying to fix this regression while keeping the original Redis API optimization intent from @auvipy.

matteius avatar Sep 01 '19 03:09 matteius

https://github.com/celery/kombu/issues/1091

auvipy avatar Sep 02 '19 07:09 auvipy

Thanks for looking into the issue!

I'm a little confused about how to proceed now that this issue is closed. Is there a patch version for kombu on the way? Should I wait for that or should I lock my celery and kombu versions to something specific in my requirements.txt? I would rather not downgrade celery if I can help it. I also would like to keep a version lock of kombu itself out of my requirements.txt if possible, especially if there's a patch on the horizon.

jslusher avatar Sep 10 '19 22:09 jslusher

Yes, kombu 4.6.5 is underway, along with celery 4.4.0rc4 / final.

auvipy avatar Sep 11 '19 04:09 auvipy

Looking forward to the update! This issue was driving me crazy.

jacobbridges avatar Sep 12 '19 15:09 jacobbridges

@jslusher I would definitely recommend pinning all of the celery requirements, for example:

(Note: these are old versions.)

celery==4.2.1 kombu==4.2.2.post1 amqp==2.3.2 billiard==3.5.0.3

The reason to pin is so that you choose when to upgrade from known-stable versions, at a point when you are ready to put time into monitoring and potentially troubleshooting any new issues. In past releases, the versions of Celery, kombu, and py-amqp haven't always been compatible with one another, especially if you pin something like Celery but let kombu update freely.

I still have not heard 100% confirmation that the patch I wrote resolves these issues, but I would encourage you to pin kombu to master on GitHub until a release containing the patch is published to PyPI. If you can do that now, you will also help verify that no more work is required.

Pip will let you specify this master branch in your requirements by replacing your kombu requirement with: git+https://github.com/celery/kombu.git
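As a concrete sketch, the kombu line in requirements.txt would be replaced by the git reference (the celery and billiard pins here are simply the versions from the report above, not a recommendation):

celery==4.3.0
billiard==3.6.1.0
git+https://github.com/celery/kombu.git

The same URL also works directly on the command line: pip install git+https://github.com/celery/kombu.git. Once a fixed kombu release reaches PyPI, the git line can be swapped back to a normal kombu==<version> pin.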

matteius avatar Sep 13 '19 03:09 matteius

I was having this issue as well with kombu 4.6.4; downgrading to version 4.6.3 fixed it.
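For anyone needing the same workaround, the downgrade is a one-liner (followed by a restart of the workers):

pip install 'kombu==4.6.3'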

AbrahamLopez10 avatar Sep 19 '19 00:09 AbrahamLopez10

I had the same issue. It seems that when I installed celery, it also installed the development version of kombu, which is currently 4.6.5. I uninstalled kombu and downgraded to the stable version, which is 4.5.0. It's working now.

joshlsullivan avatar Oct 04 '19 13:10 joshlsullivan

Not sure if I am having the same issue with celery==4.4.6 and kombu==4.6.11 or if I need to open a new ticket. I have a chord that randomly stops, and when I check with inspect I get this error. I'm using rabbitmq as the broker and redis as the backend.

bgmiles avatar Oct 15 '20 19:10 bgmiles

Also seeing this issue with these versions:

  • celery = "5.0.5"
  • kombu = "5.0.2"
  • redis = "3.5.3"

I thought the comment about it being fixed in the latest kombu meant that the v5 versions would work?

erewok avatar Feb 02 '21 22:02 erewok

Prematurely closed, IMO. The issue is not solved by 4.6.5.

lemig avatar Feb 03 '21 11:02 lemig

I am actually thinking there's a celery change that is involved in this. I don't think it's a kombu issue (at least in my case). For instance, I am looking at two applications with the following versions:

App1: Works

  • celery 4.4.6
  • kombu 4.6.11
  • redis 6.0.9

App2: Does not work

  • celery 4.4.7
  • kombu 4.6.11
  • redis 6.0.9

Notably, the only difference above is the celery patch version (4.4.6 vs 4.4.7); both apps use the same kombu version. In addition, App2 used to work before we bumped our redis version, and in fact it still works with redis 5.0.7.

I've been trying to tweak settings and see if I can figure out what will make it ping again in 4.4.7:

WORKING APP1 (celery 4.4.6):

>>> my_celery_app.control.ping()
[{'celery@my-celery-app-6d8c66b688-p5tf2': {'ok': 'pong'}}]

Non-WORKING APP2 (celery 4.4.7):

>>> my_celery_app.control.ping()
[]
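One thing worth ruling out when ping() comes back empty is the reply timeout: control.ping waits only about a second for replies by default, so a slow worker looks identical to a silent one. Passing a longer timeout (a standard keyword argument on control.ping) separates the two cases:

>>> my_celery_app.control.ping(timeout=10.0)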

erewok avatar Feb 10 '21 22:02 erewok

It was closed by a PR that was later reverted, but we didn't reopen this issue.

auvipy avatar Feb 12 '21 05:02 auvipy

I am actually thinking there's a celery change that is involved in this. I don't think it's a kombu issue (at least in my case). [...]

You should try celery 5.0.5 or master with the latest kombu. Did you find out which commit is the root cause of it not working in 4.4.7?

auvipy avatar Feb 12 '21 05:02 auvipy

I see the same behavior with celery 5.0.5. It seems odd that the (4.4.7) stack I mentioned above works on redis 5.0.7 but not on redis 6.0.9, all other things being equal. Should I open a ticket against celery proper? I'm happy to keep investigating: I looked at the diff between 4.4.6 and 4.4.7 yesterday and didn't see any telltale sign.
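In case it helps narrow things down, the changes between the two releases can be listed from a celery checkout, assuming the usual v-prefixed release tags:

git log --oneline v4.4.6..v4.4.7
git diff v4.4.6 v4.4.7 -- celery/

Running git bisect between those two tags, repeating the ping check at each step, would pinpoint the exact commit.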

erewok avatar Feb 12 '21 14:02 erewok

I need to investigate more. I just tried a clean environment with git checkout v4.4.7 && pip install -e '.[redis]' and it worked with redis-server 6.0.10, so there's something else going on.

erewok avatar Feb 12 '21 14:02 erewok

This happens when there are pending tasks in the queue from before the update; you need to figure out how to migrate the queued tasks to the newer version.

If those tasks are not important, just purge the queue and start celery again.
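If purging is acceptable, the built-in purge command drops all pending messages from the configured queues, so only use it when the tasks really are disposable:

celery -A proj purge

If dropping them is not an option, celery migrate can move messages from one broker to another instead.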

nanijnv1 avatar Jul 29 '21 13:07 nanijnv1