Results 2 comments of menyanski

For those who succeeded, what is your env parameter? I'm still experiencing this error. My build is based on working xformers 0.30, flash-attn, and sageattention; Python 3.12, cu128;...
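When debugging an environment like the one described above, a first step is confirming that each attention backend is actually discoverable by the interpreter. A minimal sketch (the package names are assumptions based on the comment; `find_spec` only locates a package, it does not import or validate it):

```python
import importlib.util
import sys

# Report the interpreter version, then check whether each
# attention backend mentioned in the comment can be located
# in the active environment without importing it.
print("python", sys.version.split()[0])
for pkg in ("xformers", "flash_attn", "sageattention"):
    spec = importlib.util.find_spec(pkg)
    print(f"{pkg}: {'found' if spec else 'MISSING'}")
```

Running this inside the same virtual environment as the failing build quickly distinguishes a missing package from a broken one.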

Ah, yes... this requires a "manual" install of llama.cpp. It's quite a roundabout, but I suggest getting an AI (Gemini/ChatGPT) to guide you on how to safely install the required...
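If the manual step refers to the Python bindings, one common route is building llama-cpp-python from source with the CUDA backend enabled rather than using a prebuilt wheel. A sketch under that assumption (the `CMAKE_ARGS` flag applies to recent llama-cpp-python releases; a CUDA toolkit and C++ compiler must already be installed):

```shell
# Force a from-source build of llama-cpp-python with CUDA support;
# --no-cache-dir avoids reusing a previously built CPU-only wheel.
CMAKE_ARGS="-DGGML_CUDA=on" pip install --no-cache-dir --force-reinstall llama-cpp-python
```

Adapt the CMake flag to your version: older releases of the bindings used a differently named CUDA option, so check the project's install notes for your pinned version.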