nunchaku
[ICLR2025 Spotlight] SVDQuant: Absorbing Outliers by Low-Rank Components for 4-Bit Diffusion Models
I see int4 is available for Fill, Canny, Depth, etc.: https://huggingface.co/collections/mit-han-lab/svdquant-67493c2c2e62a1fc6e93f45c. Please also provide an int4 version of ControlNet-Union-Pro.
LoRA link: https://huggingface.co/alimama-creative/FLUX.1-Turbo-Alpha/tree/main. The FLUX.1-Turbo-Alpha LoRA works with flux.1-dev but not with flux.1-fill-dev:  Loading configuration from F:\ComfyUI\ComfyUI\models\diffusion_models\svdq-int4-flux.1-fill-dev\comfy_config.json model_type FLUX !!! Exception during processing !!! Traceback (most recent call last): File "F:\ComfyUI\ComfyUI\execution.py",...
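For reference, reproducing this outside ComfyUI would look roughly like the sketch below. It leans on the LoRA helpers shown in nunchaku's Python examples (`update_lora_params`, `set_lora_strength`); the exact import path and the local LoRA file path are assumptions, not confirmed API.

```python
from nunchaku import NunchakuFluxTransformer2dModel  # import path may differ by version

# Load the INT4 fill transformer, then attach the Turbo-Alpha LoRA -- the step
# that the ComfyUI workflow above reports as failing for the fill model.
transformer = NunchakuFluxTransformer2dModel.from_pretrained(
    "mit-han-lab/svdq-int4-flux.1-fill-dev"
)
transformer.update_lora_params("path/to/FLUX.1-Turbo-Alpha.safetensors")  # placeholder path
transformer.set_lora_strength(1.0)
```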
Some acceleration techniques, such as [TeaCache](https://github.com/ali-vilab/TeaCache/tree/main/TeaCache4FLUX), rely on inference-time access to the individual layers of the FLUX transformer. To make them interoperable with Nunchaku, the Nunchaku library should export...
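For context, the kind of per-layer hook such techniques need looks roughly like the sketch below. It is a generic illustration, not Nunchaku's or TeaCache's actual code: the wrapper class, the relative-change threshold, and the assumption that a block takes and returns a plain hidden-state tensor are all hypothetical.

```python
import torch


class CachedBlockWrapper(torch.nn.Module):
    """Hypothetical sketch of a TeaCache-style per-block cache: reuse the block's
    last residual when its input barely changes between diffusion steps."""

    def __init__(self, block: torch.nn.Module, rel_threshold: float = 0.05):
        super().__init__()
        self.block = block
        self.rel_threshold = rel_threshold  # how similar inputs must be to reuse the cache
        self._last_input = None
        self._last_residual = None

    def forward(self, hidden_states: torch.Tensor, *args, **kwargs) -> torch.Tensor:
        if self._last_input is not None and self._last_input.shape == hidden_states.shape:
            # Relative L1 change of the input versus the previous step.
            rel_change = (
                (hidden_states - self._last_input).abs().mean()
                / (self._last_input.abs().mean() + 1e-8)
            )
            if rel_change < self.rel_threshold:
                # Input barely moved: reuse the cached residual and skip the block.
                return hidden_states + self._last_residual

        out = self.block(hidden_states, *args, **kwargs)
        self._last_input = hidden_states.detach()
        self._last_residual = (out - hidden_states).detach()
        return out
```

Applying something like this means replacing each transformer block with a wrapped copy, which is only possible if the blocks (or equivalent hooks) are exposed by Nunchaku's exported model.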
Hi, can PuLID be used with SVDQuant (Nunchaku) in ComfyUI? Please let us know, thanks.
Using ControlNet Union with the fp4 model. I confirmed it is the official fp16 ControlNet, but as soon as it reaches the sampler it exits automatically. Thanks in advance for any help.
The ComfyUI execution background information is as follows: [START] Security scan [DONE] Security scan ## ComfyUI-Manager: installing dependencies done. ** ComfyUI startup time: 2025-04-08 22:21:53.837 ** Platform: Linux ** Python...
Please help with this. I tried reinstalling nunchaku and deepcompressor, but I get the same error. Using xformers attention in VAE VAE load device: cuda:0, offload device: cpu, dtype: torch.bfloat16 2025-04-07 01:58:30.960384...
I want to store the models in a specific folder instead of the default Hugging Face cache, but I get an error with this code on Windows 11:

```python
transformer_repo = "mit-han-lab/svdq-int4-flux.1-dev"
base_model_path = "models/Flux.1-dev"
print("Here0")
...
```
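Two standard ways to get models out of the default cache, sketched below; neither is specific to Nunchaku, and the folder names are placeholders. Either download the repo into an explicit local directory with `huggingface_hub.snapshot_download`, or redirect the whole cache with the `HF_HOME` environment variable before anything touches the hub.

```python
import os
from huggingface_hub import snapshot_download

# Option A: materialize the quantized transformer in a folder of your choice
# ("models/svdq-int4-flux.1-dev" is a placeholder path).
local_transformer_path = snapshot_download(
    repo_id="mit-han-lab/svdq-int4-flux.1-dev",
    local_dir="models/svdq-int4-flux.1-dev",
)

# Option B: keep using from_pretrained(repo_id) but move the entire Hugging Face
# cache. This must be set before the first hub call in the process.
os.environ["HF_HOME"] = os.path.abspath("hf_cache")
```

If the loader accepts a local path, `local_transformer_path` can then be passed in place of the repo id.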
I implemented the Batch function. Could you please review the implementation? @ita9naiwa
