Queuing prompts is slow and gets slower as the queue grows.
To reproduce:
- Have a workflow. It does not matter what it is, so long as it changes between runs.
- Set batch size to 100 and click "queue prompt".
- Repeat.
Expected result:
Adding an item to the front or back of a queue should be a constant-time operation, so the time required to queue items should not depend on how many items are already queued.
Actual result:
Time required to queue items grows with the length of the queue. When the queue is empty, queuing 100 items takes about 7 seconds. When the queue contains 500 items, adding another 100 takes 53 seconds. When the queue contains 1000 items, adding another 100 takes 1 minute 47 seconds.
Analysis:
This seems to be partly caused by the use of a heapq rather than a deque (which would more closely match how the queue is actually used, and which has constant-time operations at both ends). But while that explains why the time grows, it can't explain why it is so slow in the first place: pushing onto a heap is only O(log N) per item, and there is no way that adding 100 items to a heapq with 1000 items in it should take a minute and a half.
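A quick micro-benchmark backs this up. This is a hypothetical sketch (the `prompt-N` payloads are stand-ins, not real queue entries) that pushes 100 items onto a heap already holding 1000, mirroring the sizes in the report, and does the same with a deque:

```python
import heapq
import time
from collections import deque

# Heap already holding 1000 entries, keyed by queue number.
heap = [(i, f"prompt-{i}") for i in range(1000)]
heapq.heapify(heap)

start = time.perf_counter()
for i in range(1000, 1100):
    heapq.heappush(heap, (i, f"prompt-{i}"))  # O(log N) per push
elapsed = time.perf_counter() - start
print(f"100 heappushes into a 1000-item heap: {elapsed * 1000:.3f} ms")

# A deque gives O(1) appends/pops at either end, matching FIFO usage.
q = deque(f"prompt-{i}" for i in range(1000))
start = time.perf_counter()
for i in range(1000, 1100):
    q.append(f"prompt-{i}")
elapsed = time.perf_counter() - start
print(f"100 deque appends onto a 1000-item deque: {elapsed * 1000:.3f} ms")
```

Both finish in well under a millisecond on any modern machine, so the multi-second stalls have to be coming from somewhere other than the heap itself.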
Oh, this is probably because the frontend requests the queue contents after you press queue, so as the queue becomes larger and larger it has more and more trouble parsing the huge JSON response.
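If that's right, the cost compounds: the response grows linearly with queue length, so re-fetching and re-parsing it after every enqueue makes the total work quadratic. A hypothetical sketch (`fake_queue_payload` is a made-up stand-in, not ComfyUI's real /queue payload, and real entries serialize far larger):

```python
import json
import time

def fake_queue_payload(n):
    # Stand-in for n queued prompts; real entries are much bigger.
    return [{"number": i, "prompt": {"nodes": {}}} for i in range(n)]

sizes = {}
for n in (100, 1000, 5000):
    blob = json.dumps(fake_queue_payload(n))
    sizes[n] = len(blob)
    start = time.perf_counter()
    json.loads(blob)  # what the client effectively redoes after each enqueue
    ms = (time.perf_counter() - start) * 1000
    print(f"{n:5d} items, {sizes[n]:8d} bytes parsed in {ms:.2f} ms")
```

Parse time tracks payload size, and with real-sized prompt graphs in each entry the per-refresh cost at 1000+ queued items plausibly reaches the delays reported above.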
Had just come to the same conclusion myself - it must be client-side. The Python side seems fine: queuing an item should be O(log N), and the comparison only ever looks at the first element of the tuple (an int), so it should be really fast.
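A quick check that Python's tuple comparison really does stop at the first element: when the queue numbers already differ, the second elements are never compared at all. `Loud` here is a made-up stand-in payload that records whether it was ever touched:

```python
class Loud:
    """Stand-in prompt payload; flags if a comparison ever reaches it."""
    compared = False

    def __lt__(self, other):
        Loud.compared = True
        return NotImplemented

    def __eq__(self, other):
        Loud.compared = True
        return NotImplemented

a = (1, Loud())
b = (2, Loud())
print(a < b)          # → True, decided by 1 < 2 alone
print(Loud.compared)  # → False, the payloads were never compared
```

So as long as queue numbers are unique ints, each heappush is a handful of int comparisons, which is consistent with the backend not being the bottleneck.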
This is still a problem. I routinely queue up a few hundred items when I step away to let it run, and once I go beyond about 50 items, each additional item gets noticeably slower to queue.