dagu
Webpage doesn't load DAGs when there are hundreds of DAGs to fetch
We are using dagu, and hundreds of new DAGs are processed each day. Each time the webpage polls the API, it takes a long time to parse the response.
My ideas to overcome this problem:
- Advanced query: only load DAGs processed in the last 5 days
- WebSockets for real-time updates
- Client-side caching
WebSockets sound like an ideal solution to me. By the way, would you mind giving a bit of context on why there are so many DAGs in your setup?
Hey, so the use case is that we use dagu to process data that arrives every 30 minutes. We also reprocess some of the past data. Each day a minimum of 42 raw data sets gets processed, so we have processed 1,174 DAGs so far. While doing this, the web page just becomes unresponsive, so I was thinking of some elegant solution.
I see, thanks for the clarification. As a quick solution, we can cache the latest DAG status on the server and update the cache whenever there's a change. This strategy is similar to the one used in the scheduler process, as seen here: https://github.com/hotaruswarm/dagu/blob/aade14ffce4dde136b57101925c188833f947b89/service/core/scheduler/entry_reader/entry_reader.go#L112-L155
Can you please explain how to implement this? I didn't get it.
Let's see… I think we can implement a cache inside the REST server to store the latest status data for each DAG, and add a small bit of logic in the agent: when the agent process finishes running a DAG, it should clear the cache for that DAG, probably via some API.
The cache mechanism is now implemented in v1.13.0. Please give it a try.