[Bug]: New parse tasks stay queued when an already-parsing knowledge base has been deleted
Is there an existing issue for the same bug?
- [x] I have checked the existing issues.
RAGFlow workspace code commit ID
...
RAGFlow image version
v0.16.0
Other environment information
I am working in an AWS virtual environment, running RAGFlow via containers.
Actual behavior
If a knowledge base is deleted while some of its documents are still parsing, the leftover parse tasks keep occupying the queue, so the next documents you start to parse are queued behind them even though the knowledge base no longer exists. My workaround so far has been to restart the machine, but this needs a proper fix.
Expected behavior
No response
Steps to reproduce
Start parsing documents in a knowledge base and delete the knowledge base while some documents are still parsing. Then create a new knowledge base; when you try to parse its documents, they stay queued behind the old tasks.
Additional information
No response
Stopping all the involved parsing tasks when a KB is deleted is not supported yet.
Okay, thank you for the response. I will handle it myself for now, but I would like to see this implemented in the near future; please let me know when it is.
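For anyone who wants to handle this on their side before it is supported upstream, one possible approach is to record the ids of deleted KBs and have the task executor skip any queued task that belongs to one of them. The sketch below is purely illustrative: the Redis key name, the `kb_id` field on the task payload, and the helper functions are assumptions, not RAGFlow's actual internals.

```python
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

CANCELLED_KBS_KEY = "cancelled_kb_ids"  # hypothetical key, not a RAGFlow name


def mark_kb_deleted(kb_id: str) -> None:
    """Call this alongside the KB delete so workers can ignore its leftover tasks."""
    r.sadd(CANCELLED_KBS_KEY, kb_id)


def should_skip(task: dict) -> bool:
    """Worker-side check: drop queued tasks whose KB was deleted."""
    kb_id = task.get("kb_id")  # assumes each queued task carries its KB id
    return kb_id is not None and r.sismember(CANCELLED_KBS_KEY, kb_id)
```

The idea is simply that deletion writes one marker and the consumer checks it before doing any work, so the stale backlog drains quickly instead of blocking new documents.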
Having a similar issue. I had a knowledge base with 30k documents and indexing was running. I decided the documents should be formatted differently, so the knowledge base was deleted, but the system keeps trying to process the tasks. Log messages look like these:
2025-06-06 22:00:28,166 WARNING 23 collect task 95d962f6428011f0afe0cad7592a290e is unknown
2025-06-06 22:00:28,172 WARNING 23 collect task 95e11bcc428011f0a801cad7592a290e is unknown
2025-06-06 22:00:28,259 WARNING 23 collect task 95e83556428011f0b5f1cad7592a290e is unknown
Considering there are 30k tasks, it will take a while for all of them to be tried and fail before the system can index new or existing documents. I was expecting that deleting the knowledge base would also delete the related tasks, but that didn't happen.
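As a stopgap, instead of waiting for 30k stale tasks to fail one by one (or restarting the machine), it may be possible to drop the backlog directly from the message queue. The sketch below assumes the deployment uses a Redis stream as the task queue; the stream name `rag_flow_svr_queue` and the connection details are assumptions that should be checked against your own configuration, and trimming the stream also discards any legitimate pending tasks, so only do this when nothing else is parsing.

```python
import redis

# Assumed connection details and queue name; verify against your compose/config files.
r = redis.Redis(host="localhost", port=6379)
QUEUE = "rag_flow_svr_queue"  # assumed stream name

# How many task messages are still sitting in the stream.
print("pending entries:", r.xlen(QUEUE))

# Drop the entire backlog. WARNING: this also removes tasks for documents
# you still want parsed, so only run it when no other parsing is in flight.
r.xtrim(QUEUE, maxlen=0, approximate=False)
```

After trimming, newly submitted documents should be picked up immediately instead of waiting behind the orphaned tasks.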