
Fix for Nvidia 5000-series cards

Open · jlitz opened this issue 5 months ago · 1 comment

I got my 5060 Ti working with help from ChatGPT. Here is the procedure:

This is how to use the 5060 and other 5000-series Nvidia video cards to create images in Forge (I haven't tried ComfyUI yet).

Install Forge on C: in its own folder, from https://github.com/lllyasviel/stable-diffusion-webui-forge. This installs Forge and the Python 3.10.6 it needs, but PyTorch and CUDA changes are required for 5000-series Nvidia cards.

For Forge support you must use cu129.

cu128 doesn't support these cards: it only goes up to sm_90, and a 5000-series card in Forge requires sm_120. Therefore use the cu129 nightly from https://pytorch.org/get-started/locally (use Notepad++ as your editor so the links can be clicked).
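
If you want a quick way to tell whether the torch wheel you already have covers the card, the public torch.cuda helpers work too (a small sketch; (12, 0) is the capability a 5000-series card should report):

python -c "import torch; print(torch.cuda.get_arch_list())"
python -c "import torch; print(torch.cuda.get_device_capability(0))"

The first command should list sm_120 and the second should print (12, 0); if sm_120 is missing from the list, the installed wheel wasn't built for these cards.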

Forge location: C:\FORGE\webui_forge_cu121_torch231\webui

Installation: first install Miniconda from https://www.anaconda.com/download

Create an environment for Forge using the Anaconda Prompt:

conda create -n forgeflux python=3.10 -y

Then activate it and install the cu129 nightly build:

conda activate forgeflux
pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu129

Verify the installation:

python -c "import torch; print(torch.__version__); print(torch._C._cuda_getArchFlags())"

The result must show sm_120, for example:

(forgeflux) C:\Users\you>python -c "import torch; print(torch.__version__); print(torch._C._cuda_getArchFlags())"
2.9.0.dev20250801+cu129
sm_70 sm_75 sm_80 sm_86 sm_90 sm_100 sm_120

If it doesn't, run pip uninstall torch torchvision torchaudio -y and repeat the cu129 installation.

Also:

cd C:\FORGE\webui_forge_cu121_torch231
pip install -r requirements.txt

(The requirements file may be named requirement-revised; if so, remove the "revised" part. It is in the webui directory.)

Create a .bat file: open a new file in Notepad++ and paste:

@echo off
cd /d C:\FORGE\webui_forge_cu121_torch231\webui
C:\Users\jlitw\miniconda3\envs\forgeflux\python.exe webui.py

Save it as run_conda.bat in C:\FORGE\webui_forge_cu121_torch231.

Create a desktop shortcut.

Or run it from the Anaconda Prompt:

conda activate forgeflux
cd C:\FORGE\webui_forge_cu121_torch231
.\run_conda.bat

Other useful commands: pip show torch and nvidia-smi.

Check the installation:

python -c "import torch; print(torch.__version__); print(torch.cuda.is_available()); print(torch._C._cuda_getArchFlags())"

Make sure these are added to your environment variables (PATH):

C:\Users\jlitw\miniconda3\condabin
C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9
C:\FORGE\webui_forge_cu121_torch231\system\python
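
To confirm the PATH edits took effect, a quick sanity check from a fresh Command Prompt (standard Windows commands, nothing Forge-specific):

where conda
where python
echo %PATH%

where should list the condabin and system\python locations above; if it doesn't, re-check the environment variable entries.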

Add to webui-user: --cuda-malloc (speed improvement), --no-half, and --precision full.
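
If you launch through webui-user.bat rather than the conda .bat files, the flags go on the COMMANDLINE_ARGS line. A minimal sketch, assuming the stock webui-user.bat layout:

@echo off

set PYTHON=
set GIT=
set VENV_DIR=
REM flags from this guide: CUDA malloc allocator plus full precision
set COMMANDLINE_ARGS=--cuda-malloc --no-half --precision full

call webui.bat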

Torch checks (run in Python):

import torch
print("GPU:", torch.cuda.get_device_name(0))
print("Capability:", torch.cuda.get_device_capability(0))
print("CUDA available:", torch.cuda.is_available())

Latest Python build, for reference: Python 3.13.5 | packaged by conda-forge | (main, Jun 16 2025, 08:20:19) [MSC v.1943 64 bit (AMD64)] on win32

Alternative .bat file to adjust VRAM usage:

@echo off
title Launch Forge FluX (Optimized)
call C:\Users\jlitw\miniconda3\condabin\conda.bat activate forgeflux
cd /d C:\FORGE\webui_forge_cu121_torch231\webui

REM Run Forge with CUDA memory boost and low-precision options
C:\Users\jlitw\miniconda3\envs\forgeflux\python.exe webui.py ^
  --cuda-malloc ^
  --no-half-vae ^
  --medvram ^
  --opt-sdp-attention

jlitz · Aug 03 '25 11:08

Or just use the Pinokio app; it will do everything for you. Works like a charm for me.

BeavisLasVega · Aug 23 '25 19:08