wardensc2

Results: 10 issues by wardensc2

### Is there an existing issue for this? - [X] I have searched the existing issues and checked the recent builds/commits ### What would your feature do ? Hi Illyasviel...

enhancement

### Is there an existing issue for this? - [X] I have searched the existing issues and checked the recent builds/commits ### What would your feature do ? Improve ~30%...

enhancement

### Checklist - [X] The issue exists after disabling all extensions - [X] The issue exists on a clean installation of webui - [ ] The issue is caused by...

### Is there an existing issue for this? - [X] I have searched the existing issues and checked the recent builds/commits ### What would your feature do ? Hi **Illyasviel**...

enhancement

Hi pkuliyi2015, with the new Stable Diffusion XL model, which uses more VRAM and demands more hardware, your extension is vital for PCs with less VRAM. But at the moment it's...

compatibility

Hi pkuliyi2015, someone named lifeisboringsoprogramming just made a loramark extension which helps us apply multiple LoRAs to any region we want, and it can also use hires fix to upscale,...

Hi @minuszoneAI, please upload the GGUF model to Hugging Face. The link from the China server is very slow to download. Thank you.

### Feature Idea I use an LLM node to create prompts inside ComfyUI; however, when loading the LLM checkpoint, ComfyUI also loads the Flux checkpoint, which increases VRAM usage a lot when I only generate...

Feature

I tried to replace all the checkpoints except the VAE with GGUF to reduce VRAM; however, my PC still crashes and needs a hard reset when it enters the VAE decoding process. Not sure...

Hi @gokayfem, when I try to load a GGUF of the DeepSeek model using your LLM loader, ComfyUI crashes. Please update the node to support this model. Thank you...