Celery Flower not using results backend
Hi guys,
I am trying to set up Flower to monitor my results. I am starting the whole thing with the command celery -A tasks flower --persistent=True
(I also tried without --persistent), and I've read that it uses an event system. I can see the generated tasks, but although I have Redis configured as the result backend
app = Celery('tasks', broker='pyamqp://guest:[email protected]', backend='redis://172.17.0.3')
it is still using its own flower file. Why? How can I tell Flower to use the Redis backend as the source for showing the task history?
Does that mean I cannot use Flower to read the tasks that are stored in my Redis backend?
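For what it's worth, the task results stored in the Redis backend remain readable directly through Celery, even if the Flower dashboard (which is built from worker events) does not show them. A minimal sketch, assuming the same tasks module as above and a placeholder task id:
from celery.result import AsyncResult
from tasks import app  # the Celery app configured above with the Redis result backend

# "some-task-id" is a placeholder; use an id returned by .delay()/.apply_async()
res = AsyncResult("some-task-id", app=app)
print(res.state)   # e.g. PENDING/SUCCESS, looked up in the Redis result backend
print(res.result)  # the stored return value, if the task has finished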
It is similar to my issue. When restarting Flower I get an empty dashboard, although I use the --persistent=True flag.
Did you specify the port and database number for Redis?
e.g. redis://backend:6379/0
Yes, I am seeing this issue on my end too. I have specified both CELERY_BROKER_URL=redis://redis:6379/0 and CELERY_RESULT_BACKEND=redis://redis:6379/0, and I can see that the relevant keys get stored in the Redis backend. However, the Flower UI does not seem to fetch the data from the Redis backend after a docker-compose down followed by a docker-compose up. If I create a new Celery task, it shows up in the Flower UI, but it is lost again after the restart.
Update:
I see in the Flower docs that we can enable persistence on the Flower side, independent of Celery's result backend. The following docker-compose.yml snippet works for me: Flower stores its state in its own database file and retains it across service restarts, so the dashboard keeps showing the historical tasks.
flower:
  image: mher/flower
  container_name: flower
  environment:
    - CELERY_BROKER_URL=redis://redis:6379/0
    - FLOWER_PORT=5555
    - FLOWER_PERSISTENT=True
    - FLOWER_STATE_SAVE_INTERVAL=10000
    - FLOWER_DB=/etc/db/flower.db
  ports:
    - "5555:5555"
  volumes:
    - ./flower/storage:/etc/db/
  depends_on:
    - redis
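For reference, these environment variables appear to map to Flower's command-line options, so (if I read the docs right) the equivalent outside docker-compose would be something like:
celery -A tasks flower --persistent=True --db=/etc/db/flower.db --state_save_interval=10000
Note that this only persists Flower's own event state in its database file; it still does not read historical tasks from the Celery result backend.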
Seeing the same issue on my end as well. I've configured Redis as my result backend for Celery as below:
celery_app = Celery(
    "foo",
    backend=settings.CELERY_BACKEND_URL,
    broker=settings.CELERY_BROKER_URL,
    include=["app.worker"]
)
When starting my Celery worker, I can see in the logs that it picks up the result backend and writes to it:
-------------- [email protected] v5.2.7 (dawn-chorus)
--- ***** -----
-- ******* ---- macOS-13.0-arm64-arm-64bit 2022-11-02 21:42:01
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: foo:0x105d8db10
- ** ---------- .> transport: amqp://foo:**@localhost:5672//
- ** ---------- .> results: redis://localhost:6379/0
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
And on the Redis side:
redis-cli
127.0.0.1:6379> KEYS *
1) "celery-task-meta-bd370fe9-63e7-412b-a81d-80e92267deab"
2) "celery-task-meta-637901fb-f403-4458-8f3d-e2ed33a67b13"
3) "celery-task-meta-9cb71514-47a0-47ef-a155-dbd5f991d518"
127.0.0.1:6379>
However, when starting Flower, even with a hardcoded --result-backend parameter, it doesn't get picked up:
celery -A app.worker.main --result-backend redis://localhost:6379/0 flower
[I 221102 21:48:09 command:162] Visit me at http://localhost:5555
[I 221102 21:48:09 command:170] Broker: amqp://foo:**@localhost:5672//
[I 221102 21:48:09 command:171] Registered tasks:
And in the Flower UI I don't see any of the previous tasks that were registered in the Redis backend before Flower started up.
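Since Flower builds its dashboard from worker events rather than from the result backend (note the "task events: OFF (enable -E to monitor tasks in this worker)" line in the worker banner above), it may also be worth double-checking that events are enabled. A rough sketch of what I mean, reusing the config above (the settings import path is hypothetical):
from celery import Celery
from app import settings  # hypothetical import path for the settings used above

celery_app = Celery(
    "foo",
    backend=settings.CELERY_BACKEND_URL,
    broker=settings.CELERY_BROKER_URL,
    include=["app.worker"]
)

# Make workers emit the events Flower listens for (same effect as running the worker with -E)
celery_app.conf.worker_send_task_events = True
# Also emit a task-sent event when a task is published
celery_app.conf.task_send_sent_event = True
Even with events enabled, though, tasks that completed before Flower was running would still not show up, since Flower only records the events it receives.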
Duplicate of https://github.com/mher/flower/issues/542