8007000E Not enough memory resources are available to complete this operation.
When I run the dml_onnx.py file from (amd_venv) C:\amd-stable-diffusion\difusers-dml\examples\inference, I get an error like this:

```
(amd_venv) C:\amd-stable-diffusion\difusers-dml\examples\inference>dml_onnx.py
Fetching 19 files: 100%|███████████████████████████████████████████ | 19/19 [00:00<00:00, 1966.19it/s]
2022-10-10 10:05:28.0893026 [E:onnxruntime:, inference_session.cc:1484 onnxruntime::InferenceSession::Initialize::<lambda_70debc81dc7538bfc077b449cf61fe32>::operator()] Exception during initialization: D:\a_work\1\s\onnxruntime\core\providers\dml\DmlExecutionProvider\src\BucketizedBufferAllocator.cpp(122)\onnxruntime_pybind11_state.pyd!00007FFA82D145C8: (caller: 00007FFA8336D326) Exception(1) tid(4790) 8007000E Not enough memory resources are available to complete this operation.
Traceback (most recent call last):
  File "C:\amd-stable-diffusion\difusers-dml\examples\inference\dml_onnx.py", line 220, in
```
How do I fix it?
@muhademan,
Could you please copy-paste a reproducible code snippet here? Sadly, I currently have no idea what code you ran that produced this error, so I can't really help :-/
The same thing happens to me often. I have an RX560 4G and 16G of DDR3, and I can use the ONNX pipeline, but every so often I get that error when initializing the script, which I set up by following this guide: https://www.travelneil.com/stable-diffusion-windows-amd.html This is the code:
```python
from diffusers import StableDiffusionOnnxPipeline

pipe = StableDiffusionOnnxPipeline.from_pretrained("./stable_diffusion_onnx", provider="DmlExecutionProvider")

prompt = "A happy celebrating robot on a mountaintop, happy, landscape, dramatic lighting, art by artgerm greg rutkowski alphonse mucha, 4k uhd'"

image = pipe(prompt).images[0]
image.save("output.png")
```
It might be worth mentioning that if I place the last two lines inside a while loop and the first image successfully starts to generate, then no matter how many times it loops I will not get that error. It only happens sometimes when first running the script, so my guess is that it happens while loading the pipe with 'from_pretrained'. It's just weird to get a 'not enough memory' error that gets solved by re-running the script without even closing any programs.
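Since the failure only seems to happen during pipeline initialization, one pragmatic workaround is to retry `from_pretrained` a few times before giving up. This is just a minimal sketch of that idea, not an official fix; the model path, retry count, and delay are placeholders:

```python
import time
from diffusers import StableDiffusionOnnxPipeline

def load_pipe_with_retry(model_path="./stable_diffusion_onnx", attempts=3, delay=5):
    # Retry initialization, since the 8007000E error appears intermittently
    # while the DML session/allocator is being set up.
    last_error = None
    for i in range(attempts):
        try:
            return StableDiffusionOnnxPipeline.from_pretrained(
                model_path, provider="DmlExecutionProvider"
            )
        except Exception as e:  # onnxruntime surfaces the DML allocation failure here
            last_error = e
            print(f"Init attempt {i + 1} failed: {e}")
            time.sleep(delay)  # give the OS a moment to free memory
    raise last_error

pipe = load_pipe_with_retry()
image = pipe("A happy celebrating robot on a mountaintop").images[0]
image.save("output.png")
```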
cc @anton-l here in case you have a hunch of what might be going on
Windows + AMD GPUs is very unfamiliar territory for me, but maybe @pingzing or @harishanand95 could check it out :)
Hi, I'm @harishanand95's manager. There are critical limitations with the ONNXRuntime currently that make this inference path very sub-optimal. We're working hard to find much faster and leaner alternative solutions, but it's complicated and it takes time and effort. Thank you for your patience and I'm sorry you're running into these issues.
@muhademan how much VRAM and system RAM do you have? Generating a 512x768 image takes more than 8GB of VRAM and 16GB of system RAM using ONNX and the DmlExecutionProvider.
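If memory is the limiting factor, one thing worth trying is generating at a smaller resolution, since peak memory scales with image size. A hedged sketch, assuming your diffusers version exposes the height/width arguments on the ONNX pipeline (dimensions must be divisible by 8):

```python
from diffusers import StableDiffusionOnnxPipeline

pipe = StableDiffusionOnnxPipeline.from_pretrained(
    "./stable_diffusion_onnx", provider="DmlExecutionProvider"
)

# 512x512 needs noticeably less memory than 512x768.
image = pipe(
    "a mountain landscape, dramatic lighting",
    height=512,
    width=512,
    num_inference_steps=30,
).images[0]
image.save("output_512.png")
```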
@GreenLandisaLie when you encounter the reported issue and open Task Manager, what is the GPU VRAM and system RAM usage? I believe a 4GB card with 16GB of system RAM would be near the bare minimum and would at times run into an out-of-memory error when initializing the pipe, depending on what else is running on your Windows system.
@anton-l I've heard that the diffusers process can be broken into separate fragments, loading only parts of the model when needed, which reduces the memory requirements. Example: https://github.com/neonsecret/stable-diffusion Might be something that could help with the memory issues that users report.
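For what it's worth, the standard PyTorch diffusers pipeline already exposes a related memory-saving knob: attention slicing, which computes attention in smaller chunks so less memory is needed at once. This is only a sketch for the regular (non-ONNX) pipeline and may not apply to the DirectML path; the model id and device are placeholders:

```python
import torch
from diffusers import StableDiffusionPipeline

# Standard PyTorch pipeline (not the ONNX/DirectML one).
pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")

# Compute attention in slices instead of all at once,
# trading some speed for a lower peak memory footprint.
pipe.enable_attention_slicing()

image = pipe("a mountain landscape at sunset").images[0]
image.save("sliced_output.png")
```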
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.