
[bug]: Models constantly unload after every generation

abes200 opened this issue on Jul 12, 2025 · 8 comments

Is there an existing issue for this problem?

  • [x] I have searched the existing issues

Operating system

Windows

GPU vendor

Nvidia (CUDA)

GPU model

No response

GPU VRAM

No response

Version number

5.15

Browser

Firefox

Python dependencies

No response

What happened

Models are immediately removed from RAM after every generation, including during batch generations or multiple iterations. So if I have a queue of 10 images to be generated, the models are loaded and unloaded 10 times. I have tried setting `lazy_offload` to true with no effect. As far as I can tell, there is no setting or option to stop this from happening.
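
For reference, this is roughly how I have it set in `invokeai.yaml`. The `lazy_offload` key is the one I actually changed; the commented cache-size keys below are only guesses at the newer 5.x cache options and I have not confirmed they apply to 5.15, so treat them as assumptions rather than documented settings:

```yaml
# invokeai.yaml -- model cache settings (sketch; key names may differ by version)
lazy_offload: true          # the setting I enabled, with no observable effect
# Assumed 5.x cache-budget keys, not confirmed for 5.15:
# max_cache_ram_gb: 24      # hypothetical: RAM budget for cached models
# max_cache_vram_gb: 10     # hypothetical: VRAM budget for cached models
```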

What you expected to happen

I would expect the model to stay in memory for at least a short time, or at the very least for the duration of a batch run. Reloading it for every image slows down generating multiple images. Keeping models cached should also be the default behavior.

How to reproduce the problem

No response

Additional context

No response

Discord username

No response
