InvokeAI
[bug]: InvokeAI crash after changing Models
Is there an existing issue for this?
- [X] I have searched the existing issues
OS
Linux
GPU
cuda
VRAM
No response
What happened?
Hello, I have a problem: when I change the model in the web UI, InvokeAI crashes, and the console shows the error in the screenshot below. Can someone help me, please?
Screenshots
Additional context
No response
Contact Details
No response
That error message means that you're out of memory. Without knowing your GPU and VRAM, I can't tell if this is understandable/expected or not.
I have a Tesla T4 with 15109 MiB of VRAM. I can load the base model and use it normally, but when I try to change the model it just crashes.
I have over 15 GB of VRAM free when I change models.
I can confirm this: switching to the inpainting model sometimes causes InvokeAI to hang. Switching between other models often does the same thing.
OS: Linux, GPU: Nvidia GTX 1060 6 GB, RAM: 16 GB
I have the same issue, whole OS stutters for a few seconds before InvokeAI crashes completely.
Graphics: GTX 1080 Processor: i7-8700k Memory: 16GB OS: Nobara Linux 36 Windowing System: Wayland Gnome version: 42.4
@lstein Any ideas?
The whole model loading system is being overhauled for the 2.3 release, which may resolve this issue.
In the meantime, how much system RAM do you have? The problem may not be running out of VRAM, but running out of RAM when the system attempts to cache the model in CPU memory. You can test this by launching invoke.py with --max_loaded_models=1, which will prevent all caching.
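As a sketch, the workaround above would be invoked like this (the `scripts/invoke.py` path is an assumption and may differ depending on how InvokeAI was installed; the flag name is taken from the comment above):

```shell
# Keep only one model resident at a time, disabling the CPU-side
# model cache. This avoids the RAM spike when switching models, at
# the cost of slower switches since each one reloads from disk.
python scripts/invoke.py --max_loaded_models=1
```

If the crashes stop with this flag, the issue is likely RAM exhaustion during model caching rather than VRAM.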
There has been no activity in this issue for 14 days. If this issue is still being experienced, please reply with an updated confirmation that the issue is still being experienced with the latest release.