stable-diffusion-webui
[Bug]: Unable to use Xformer on RTX 4090
Is there an existing issue for this?
- [X] I have searched the existing issues and checked the recent builds/commits
What happened?
Whenever I attempt to use --xformers, or use a prebuilt wheel with the argument --force-enable-xformers, it refuses to work. I am using an RTX 4090, so that's likely why, but even when using the prebuilts from the https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/2449 thread I get the same result.
The error I'm seeing: I get a pop-up error when attempting to load SD that says: "The procedure entry point ?matmil@at@@ya?AVTensor@1@AEBV21@0@Z could not be located in the dynamic link library D:\stable-diffusion-webui\venv\Lib\site-packages\xformers\_C.pyd"
In the CMD window, after pressing OK on the error message, I see this: "WARNING:root:WARNING: [WinError 127] The specified procedure could not be found Need to compile C++ extensions to get sparse attention suport. Please run python setup.py build develop"
I have tried running python setup.py build develop as the error message suggests, but I still get the message every time. I followed this guide https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Xformers and it hasn't worked. I've rebuilt SD about three times now, uninstalling old dependencies like Python, torch, etc. to have a fresh start, but it hasn't made a difference.
Steps to reproduce the problem
1st method: when using a prebuilt I have the following arguments: --autolaunch --opt-channelslast --force-enable-xformers. After placing the prebuilt .whl file in the directory, I open CMD and type:

```
cd D:\stable-diffusion-webui
D:
.\venv\bin\activate
pip install -U -I --no-deps https://github.com/C43H66N12O12S2/stable-diffusion-webui/releases/download/f/xformers-0.0.14.dev0-cp310-cp310-win_amd64.whl
```
This method reports that it's installed, but results in the error mentioned above.
2nd method: when attempting to build my own, I followed this guide https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Xformers, the main exception being that I have CUDA 11.8 installed, as I've read it's required for RTX 40XX cards.
What should have happened?
Xformers should load with its usual message in the CMD window and give better generation rates.
Commit where the problem happens
30b1bcc64e67ad50c5d3af3a6fe1bd1e9553f34e
What platforms do you use to access UI ?
Windows
What browsers do you use to access the UI ?
Mozilla Firefox
Command Line Arguments
--autolaunch --opt-channelslast --force-enable-xformers
git pull
OR
--autolaunch --opt-channelslast --xformers
git pull
Additional information, context and logs
CMD startup log:

```
warning: redirecting to https://github.com/AUTOMATIC1111/stable-diffusion-webui/
Already up to date.
venv "D:\stable-diffusion-webui\venv\Scripts\Python.exe"
Python 3.10.6 (tags/v3.10.6:9c7b4bd, Aug 1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
Commit hash: 30b1bcc64e67ad50c5d3af3a6fe1bd1e9553f34e
Installing requirements for Web UI
Launching Web UI with arguments: --autolaunch --opt-channelslast --force-enable-xformers
[WinError 127] The specified procedure could not be found
WARNING:root:WARNING: [WinError 127] The specified procedure could not be found
Need to compile C++ extensions to get sparse attention suport. Please run python setup.py build develop
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
making attention of type 'vanilla' with 512 in_channels
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
making attention of type 'vanilla' with 512 in_channels
Loading weights [4470c325] from D:\stable-diffusion-webui\models\Stable-diffusion\wd-v1-3-float32.ckpt
Global Step: 683410
Applying xformers cross attention optimization.
Model loaded.
Loading hypernetwork anime
[1, 2, 1]
Activation function is None
Weight initialization is Normal
Layer norm is set to False
Dropout usage is set to False
Activate last layer is set to True
Loaded a total of 0 textual inversion embeddings.
Embeddings:
```
Check your pytorch version. The xformers prebuilt wheels do not support different pytorch versions, including different CUDA versions. CUDA 11.8 is fine; I also use it, and built xformers myself on top of torch 1.13+cu117 and Python 3.9 with no issues. You will see a warning but not an error.
Any tips on how to build pytorch to support CUDA 11.8?
You do not need to. I just installed pytorch built with CUDA 11.7 on my system, then built xformers using that pytorch and the CUDA 11.8 SDK.
I forgot to add that in my original post, but yes, I've attempted it with cu116 and cu117 on 1.13, and even on the nightly build with 1.17 as well. Even then it still gives me the same issue. I used the commands from https://pytorch.org/get-started/locally/.
However, upon running python and entering import torch followed by print(torch.__version__),
I found it came back with 1.13+cpu as the version.
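A quick way to spot this situation (a minimal sketch; the `+cpu`/`+cuXXX` suffix convention comes from how PyTorch names its wheel builds) is to check the local version suffix of `torch.__version__`:

```python
def is_cpu_build(version: str) -> bool:
    """Return True if a torch version string names the CPU-only wheel.

    PyTorch wheels encode the build in the local version suffix,
    e.g. "1.13.0+cpu" (CPU-only) vs "1.13.0+cu117" (CUDA 11.7).
    """
    return version.endswith("+cpu")

# Example: inside the venv, pass torch.__version__ to this check.
print(is_cpu_build("1.13.0+cpu"))    # True: the wheel that breaks GPU xformers builds
print(is_cpu_build("1.13.0+cu117"))  # False: CUDA build, fine for building xformers
```

If this returns True for your installed torch, building xformers against it will produce a CPU-only extension.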
I realized this was due to the guide from the xformers page on this wiki, which states: "To avoid issues with getting the CPU version, install pyTorch seperately:"

```
pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/cu113
```
I'm trying the build with --extra-index-url https://download.pytorch.org/whl/nightly/cu117 instead to see what happens.
Yes, the extra-index URL specifies which CUDA version you get. A URL ending in cu113 means you get torch built against CUDA 11.3. If you have CUDA 11.8 on your PC, I recommend installing torch with cu117 (CUDA 11.7) for the latest CUDA updates. Simply change the URL to end with cu117.
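To illustrate the suffix convention (an illustrative helper, not part of any tool discussed here): the `cuXYZ` tag is just the CUDA version with the dot removed, appended to the wheel index URL:

```python
def torch_index_url(cuda_version: str) -> str:
    """Build the PyTorch extra-index URL for a given CUDA toolkit version.

    e.g. "11.7" -> "https://download.pytorch.org/whl/cu117"
    """
    tag = "cu" + cuda_version.replace(".", "")
    return "https://download.pytorch.org/whl/" + tag

print(torch_index_url("11.7"))  # https://download.pytorch.org/whl/cu117
print(torch_index_url("11.3"))  # https://download.pytorch.org/whl/cu113
```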
I've forced it to the correct version by deleting the PyTorch files located in AppData, as pip uninstall torch didn't seem to cut it. Now I get:
I tried to launch the webui once more and still get the same WinError 127 message as in the original post. I attempted to reapply the .whl file, but get:
Where do I apply the --force-reinstall? In the arguments line? I've tried this, but it states that launch.py doesn't recognize it.
pip uninstall xformers, then pip install the newly built wheel again (you have to build it yourself, because the one provided by this repo is built on torch 1.12.1 and won't work with 1.13).
The --force-reinstall is meant for the pip command, but you should not use it here, because it will force-install the CPU torch.
I see. OK, that'd make sense. I'll give it a shot soon. Thanks.
So I followed the steps from https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Xformers and it generates a file, but it's only about 300 KB, which doesn't seem right. Should I be allowing xformers to create a directory within SD, or should I have it under a separate folder entirely?
EDIT: I've tried generating xformers elsewhere and it still isn't building fully. I've also noticed that when it builds, it works entirely from a cached version of the files, which might be the problem.
Yes, you should be getting around an 88 MB .whl file. You do need to clear all cache files. Whether it's in the SD dir or not does not matter, as long as you are in the xformers repo.
So I uninstalled/reinstalled all the programs once more and deleted the AppData folders as well, but when I run pip install -r requirements.txt it still says:
```
(venv) D:\Stable Diffusion\stable-diffusion-webui\xformers>pip install -r requirements.txt
Collecting torch>=1.12
  Using cached torch-1.13.0-cp310-cp310-win_amd64.whl (167.3 MB)
Collecting numpy
  Using cached numpy-1.23.4-cp310-cp310-win_amd64.whl (14.6 MB)
Collecting pyre-extensions==0.0.23
  Using cached pyre_extensions-0.0.23-py3-none-any.whl (11 kB)
Collecting typing-inspect
  Using cached typing_inspect-0.8.0-py3-none-any.whl (8.7 kB)
Collecting typing-extensions
  Using cached typing_extensions-4.4.0-py3-none-any.whl (26 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Installing collected packages: mypy-extensions, typing-extensions, numpy, typing-inspect, torch, pyre-extensions
Successfully installed mypy-extensions-0.4.3 numpy-1.23.4 pyre-extensions-0.0.23 torch-1.13.0 typing-extensions-4.4.0 typing-inspect-0.8.0
```
Is there another location where the cached files would be?
EDIT: Found where it is with the command pip cache dir.
Will give it another try and see what happens this time.
Sorry for being unclear; what I meant (and what I thought you meant) was that the cache files located in the build/ folder inside the xformers folder must be cleared. Those are caches from building the xformers wheel.
As for requirements.txt: no, this has nothing to do with the cache. You are installing the CPU version of pytorch from requirements.txt, and then you cannot build xformers with GPU support. You got a 300 KB file probably because you built xformers for CPU only, since adding GPU support increases the file size a lot. This is why the wiki asks you to install GPU pytorch first, before installing requirements.txt, so that the existing GPU pytorch doesn't get overwritten.
For a fast way to clear the pip cache, run pip cache purge. However, as I said above, you do not need to do this, and doing it will not fix the problem.
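In other words, the ordering matters. A sketch of the safe sequence inside the xformers venv (the cu117 URL follows the wiki's pattern; adjust it to your CUDA version):

```shell
# Install the CUDA build of torch FIRST...
pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/cu117
# ...so that requirements.txt sees torch>=1.12 as already satisfied and does
# not replace it with the default CPU-only wheel from PyPI.
pip install -r requirements.txt
# Clear stale build artifacts from any previous CPU-only attempt.
python -c "import shutil; shutil.rmtree('build', ignore_errors=True)"
```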
I see, OK. I felt like I did do that prior to this attempt, but it still built incorrectly. I'll try it once more.
Thanks for all the help. I'm a complete scrub when it comes to Python/cmd prompts.
So I was able to generate the file; xformers-0.0.14.dev0-cp310-cp310-win_amd64.whl came in at 90 MB. However, it generated a different error saying xformers.ops wasn't available, something to the effect of:

```
Cannot import xformers
Traceback (most recent call last):
  File "F:\NovelAI\Clone\stable-diffusion-webui\modules\sd_hijack_optimizations.py", line 15, in <module>
    import xformers.ops
ModuleNotFoundError: No module named 'xformers.ops'
```

(I unfortunately cannot see the exact error anymore, as it went back to the WinError 127 after I tried a few things to resolve it based on what I found online. It's very similar to what I remember seeing, except I believe it was line 16 instead of 15.)
I'll cover the step-by-step of what I tried for the install process:
- Uninstall and reinstall Python 3.10.6 / Git
- Right-click and select "Git Bash Here" in my D: drive
- The stable-diffusion-webui directory is created
- Open CMD and change directory to the webui install
- Run git clone https://github.com/facebookresearch/xformers.git
- Once the download is complete, I follow these steps (Ninja is already installed in the Windows folder):

```
git clone https://github.com/facebookresearch/xformers.git
cd xformers
git submodule update --init --recursive
python -m venv venv
.\venv\scripts\activate
pip3 install --pre torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/nightly/cu117
pip install -r requirements.txt
pip install wheel
set NVCC_FLAGS=-allow-unsupported-compiler
python setup.py build
python setup.py bdist_wheel
```

- Wait for the .whl file to finish building
- Install the .whl file in the webui directory with these commands:

```
.\venv\scripts\activate
pip install xformers-0.0.14.dev0-cp310-cp310-win_amd64.whl
```
The main two things I think could be causing it:
1. Does the webui .bat need to be opened first, to create the repository/venv folders, before doing the xformers part? (I didn't do this because the directions don't outright say it's necessary, and it also installs pytorch files, so I thought it'd be best to wait.)
2. Do I install the .whl file into the webui directory or the xformers directory? Reading the steps from the wiki, I understood it as needing to change to the webui directory to complete it: "In xformers directory, navigate to the dist folder and copy the .whl file to the base directory of stable-diffusion-webui In stable-diffusion-webui directory, install the .whl, change the name of the file in the command below if the name is different: ./venv/scripts/activate pip install xformers-0.0.14.dev0-cp310-cp310-win_amd64.whl"
I believe I got it fixed. Updating launch.py with pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu117 as the replacement worked, and opening the webui afterwards to let it install the repositories seems to be the way to go about it. It loaded with no errors this time.
Yeah, everything's good to go. Thanks for the help @aliencaocao
Trying to follow along with what you did here, replacing step 6's pip3 install with the new line, and I'm coming up with a ton of errors.
```
C:\Users\dusty\Desktop\Stable-Diffusion-2\xformers\venv\lib\site-packages\torch\utils\cpp_extension.py:358: UserWarning: Error checking compiler version for cl: [WinError 2] The system cannot find the file specified
  warnings.warn(f'Error checking compiler version for {compiler}: {error}')
building 'xformers._C' extension
```

```
build\lib.win-amd64-cpython-310\xformers\_C.pyd : fatal error LNK1120: 2 unresolved externals
error: command 'C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.34.31933\bin\HostX86\x64\link.exe' failed with exit code 1120
```
I'm lost enough that I don't even know what's wrong. I only figured out what Ninja was by googling and stumbling back into the original xformers instructions.
Those instructions were a step-by-step of what I did while I was still having problems; they're not the correct solution. What I did was follow the instructions here: https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Xformers
Ultimately I did this (assuming you have all the correct installs for the mentioned wiki; you need VS Tools 2022, CUDA 11.8, Python, and Git):
- Cleared out installs of Python and any current SD install, then reinstalled Python 3.10.6
- Git-cloned the stable-diffusion build
- Went to the webui directory and modified the launch.py line to pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu117 (if you search for "pip install torch" it should have cu113 in there; change that to cu117)
- Launched the webui to let it generate venv and such, then ran:

```
source ./venv/bin/activate
cd repositories
git clone https://github.com/facebookresearch/xformers.git
cd xformers
git submodule update --init --recursive
python -m venv venv
./venv/scripts/activate
pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/cu117
pip install -r requirements.txt
pip install wheel
```

- OPTIONAL: download and install https://github.com/ninja-build/ninja/releases (place ninja.exe under C:\Windows; it makes the build faster), then:

```
python setup.py build
python setup.py bdist_wheel
```

- Once it's done generating, go to the dist folder and copy the generated file (xformers-0.0.14.dev0-cp310-cp310-win_amd64.whl) to the main webui directory. *I started a new CMD when doing this step:

```
./venv/scripts/activate
pip install xformers-0.0.14.dev0-cp310-cp310-win_amd64.whl
```

- Lastly, edit the stable-diffusion-webui arguments section to add --force-enable-xformers
Now it should work.
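A quick sanity check after installing the wheel (a minimal sketch; run it with the webui's venv activated) is to attempt the exact import that sd_hijack_optimizations.py performs:

```python
# Try the import that stable-diffusion-webui itself performs at startup; if
# this fails here, --force-enable-xformers will fail the same way in the UI.
try:
    import xformers.ops  # noqa: F401
    status = "ok"
except Exception as exc:  # ModuleNotFoundError, or DLL load failures like WinError 127
    status = f"failed: {exc}"

print("xformers.ops import:", status)
```

If it prints "ok", the wheel matches your installed torch; a failure message points back at a torch/xformers build mismatch.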
The only thing that worked for me was to uninstall VS2022. It compiled fine with VS2019 (and its accompanying tools).
I think I found an easier solution. I got this error after updating xformers.
This is what fixed it for me:
Go inside this folder:
C:\Programs\Automatic1111\venv\Scripts
Run cmd, then uninstall and reinstall xformers:

```
pip uninstall xformers
pip install -U -I --no-deps https://github.com/C43H66N12O12S2/stable-diffusion-webui/releases/download/torch13/xformers-0.0.14.dev0-cp310-cp310-win_amd64.whl
```
This worked for me, but it didn't increase the it/s at all.