stable-diffusion-webui
[Bug]: PyTorch doesn't reserve all of available VRAM
Is there an existing issue for this?
- [X] I have searched the existing issues and checked the recent builds/commits
What happened?
After switching from Windows to EndeavourOS (an Arch Linux distro), I noticed that PyTorch doesn't allocate all of the available memory (even in nvidia-smi
there is still ~500 MB of free VRAM), and if I compare the usage reported for the python3.10
process with the memory PyTorch reports as reserved, they don't match.
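For anyone wanting to check the same numbers, a minimal sketch of the comparison I'm describing, using PyTorch's own allocator counters (the `vram_report` helper name is mine; compare its output against the python3.10 row in nvidia-smi):

```python
import torch

def vram_report() -> str:
    """Summarize PyTorch's view of VRAM on the current CUDA device."""
    if not torch.cuda.is_available():
        return "no CUDA device available"
    # Memory held by PyTorch's caching allocator (what "reserved" means here);
    # nvidia-smi additionally counts the CUDA context, so its number is larger.
    reserved = torch.cuda.memory_reserved() / 2**20
    # Memory actually occupied by live tensors.
    allocated = torch.cuda.memory_allocated() / 2**20
    return f"reserved={reserved:.0f} MiB, allocated={allocated:.0f} MiB"

print(vram_report())
```

Note that nvidia-smi showing ~500 MB free doesn't contradict PyTorch's numbers by itself, since the caching allocator only grows its reserved pool on demand.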
Steps to reproduce the problem
I don't know how to reproduce this.
What should have happened?
PyTorch should allocate all of the available memory.
Commit where the problem happens
0cc0ee1bcb4c24a8c9715f66cede06601bfc00c8
What platforms do you use to access the UI ?
Linux
What browsers do you use to access the UI ?
Google Chrome
Command Line Arguments
--medvram --no-half --precision full --xformers --api --theme dark
List of extensions
a1111-stable-diffusion-webui-vram-estimator
Console logs
https://pastebin.com/b3Gwn2XD
Additional information
On Windows 11 it was reserving up to ~5.5 GB; here, however, it reserves only ~3.5 GB.
Why would you expect it to reserve all available VRAM? It should only use what it needs.
Because it was doing so on Windows. Also, I resolved this issue by downgrading to PyTorch 1.12.1 with CUDA 11.6; upgrading to PyTorch 2.1 with CUDA 11.8 also resolves it.