MisterChief95
Hello, I don't think xFormers supports CUDA 12.8. [The latest release is for CUDA 12.6](https://github.com/facebookresearch/xformers/releases/tag/v0.0.29.post3). If you're using a 50x0 Nvidia graphics card, the built-in PyTorch attention should work...
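If it helps, a quick way to check which torch build you have and which CUDA version it was compiled against (so you can tell whether an xFormers wheel could even match it):

```sh
# Print the installed torch version and the CUDA version it was built against
python.exe -c "import torch; print(torch.__version__, torch.version.cuda)"
```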
SageAttention Wheels: https://github.com/woct0rdho/SageAttention/releases/tag/v2.1.1-windows
FlashAttention Wheels: https://github.com/kingbri1/flash-attention/releases/tag/v2.7.4.post1

Make sure to use the same torch/CUDA versions of the wheels as your Forge install. Now, Forge itself doesn't support SageAttn/FlashAttn. However, there is...
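As a rough sketch of the install step (the path below is a placeholder, not a real release asset — use the wheel you actually downloaded, with torch/CUDA/Python tags matching your Forge environment):

```sh
# Install a downloaded wheel into Forge's embedded Python
# (placeholder path -- substitute the wheel from the release pages above)
python.exe -m pip install path\to\downloaded-wheel.whl
```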
Please share the error if it happens again; that will help us figure out what to do :) You will want to uninstall torch 2.6 and use torch 2.7, and...
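Roughly, that swap looks like this (the cu128 index URL is an assumption — use the channel that matches your GPU/CUDA setup):

```sh
# Remove the torch 2.6 packages, then install 2.7 from the matching CUDA channel
python.exe -m pip uninstall -y torch torchvision torchaudio
python.exe -m pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu128
```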
Triton is optional, but easy to install. Just go to the python directory and run this command:

```sh
python.exe -m pip install triton-windows
```

This command assumes you're on torch...
Ah, I see, it's related to the git pull. Your Python is fine - the git pull is something specific to the Forge code. Run the command from the directory...
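For anyone following along, that would look something like this (the path is just an example — use wherever your Forge is installed):

```sh
# Update Forge from its own install directory (example path)
cd C:\path\to\stable-diffusion-webui-forge
git pull
```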
Glad you got it working! :)
Like nothing works at all? Like clicking different tabs and such? Regardless, if it works in other browsers then it's simply a problem with LibreWolf, and they would need...
> Got the error below, How can I fix it?
>
> CUDA error: no kernel image is available for execution on the device
> CUDA kernel errors might be asynchronously...
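As a hedged aside, that error usually means the installed torch build wasn't compiled for the GPU's architecture. One way to check is to print the architectures the build targets (a 50-series card, for example, needs sm_120 in the list):

```sh
# List the GPU architectures this torch build was compiled for;
# a missing entry for your card's compute capability produces "no kernel image" errors
python.exe -c "import torch; print(torch.cuda.get_arch_list())"
```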
> PyTorch: 2.5.1+cu128 (nightly)

I don't think there is a cu128 build of PyTorch 2.5.1. The version with CUDA 12.8 support is the preview build that comes after 2.6.0. Was...
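For reference, a sketch of how that preview build would be installed — the nightly cu128 index URL is an assumption, so check pytorch.org for the current channel before running it:

```sh
# Install the preview (nightly) torch build from the cu128 channel
python.exe -m pip install --pre torch --index-url https://download.pytorch.org/whl/nightly/cu128
```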
How much VRAM do you have, and what's the GPU weights setting at the top of the screen set to?