nunchaku

[ICLR2025 Spotlight] SVDQuant: Absorbing Outliers by Low-Rank Components for 4-Bit Diffusion Models

Results: 140 nunchaku issues (sorted by recently updated)

### Checklist - [x] 1. I have searched for related issues and FAQs (https://github.com/mit-han-lab/nunchaku/discussions/262) but was unable to find a solution. - [x] 2. The issue persists in the latest...

bug
lora

0.0 seconds (IMPORT FAILED): /root/ComfyUI/custom_nodes/ComfyUI-nunchaku. The environment is fully set up (Linux, Python 3.10.15, torch 2.5) and nunchaku was installed from Nunchaku-0.2.0+torch2.5-cp310-cp310-linux_x86_64.whl, but the plugin still fails to import.
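Since the wheel filename encodes the Python version, torch version, and platform (cp310, +torch2.5, linux_x86_64), a quick first check is whether the running interpreter actually matches those tags. The snippet below is a generic diagnostic sketch, not part of nunchaku or ComfyUI-nunchaku:

```py
import sys
import torch

# Confirm the interpreter and torch build match the wheel tags
# (cp310 -> Python 3.10, +torch2.5 -> torch 2.5.x); a mismatch is a
# common cause of "IMPORT FAILED" for prebuilt extensions.
print("python:", sys.version.split()[0])  # expect 3.10.x
print("torch:", torch.__version__)        # expect 2.5.x
print("cuda:", torch.version.cuda)

try:
    import nunchaku  # the compiled extension; fails if the ABI tags mismatch
    print("nunchaku imported from", nunchaku.__file__)
except ImportError as e:
    print("import failed:", e)
```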

Hello everyone, As promised, last month we brought multiple-LoRA and ControlNet-Union-Pro support with faster generation speed. Additionally, we expanded support for 20-series GPUs. We understand some of you may still...

Previously, the following would not properly clear all memory:

```py
import gc, torch

del pipeline, transformer
torch.cuda.empty_cache()
```

This PR fixes that by resetting the internal quantized model state on...
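As a usage sketch (not the PR's actual code), the cleanup pattern expected to work after this fix would look roughly like the following; `pipeline` and `transformer` mirror the names in the snippet above and are assumed to be the objects created earlier:

```py
import gc
import torch

# pipeline, transformer: the quantized pipeline objects created earlier.
# Drop all Python references, force a GC pass so the quantized model is
# actually collected, then ask the CUDA caching allocator to release the
# freed blocks back to the driver.
del pipeline, transformer
gc.collect()
torch.cuda.empty_cache()

print(f"allocated after cleanup: {torch.cuda.memory_allocated() / 1e6:.1f} MB")
```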

Fixes https://github.com/mit-han-lab/nunchaku/issues/233 This PR exposes the `norm1` layer from the transformer blocks, which is used by TeaCache. In this way, SVDQuant and TeaCache can be combined to get an even...
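For context, TeaCache decides whether to reuse a cached block output by measuring how much the norm1-modulated input changes between consecutive denoising steps. The helper below is a minimal conceptual sketch of that criterion; the names and threshold are illustrative assumptions, not nunchaku's or TeaCache's actual code:

```py
import torch

def should_reuse_cached_output(prev_modulated, curr_modulated, threshold=0.25):
    """Return True when the norm1-modulated input has barely changed since the
    previous denoising step, so the cached block output can be reused."""
    if prev_modulated is None:
        return False  # first step: nothing cached yet
    rel_change = ((curr_modulated - prev_modulated).abs().mean()
                  / prev_modulated.abs().mean())
    return rel_change.item() < threshold
```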

### Checklist - [x] 1. If the issue you raised is not a feature but a question, please raise a discussion at https://github.com/mit-han-lab/nunchaku/discussions/new/choose. Otherwise, it will be closed. - [x]...

enhancement
faq

I successfully installed it but still encountered the same issue as before. I can guarantee that this model works in the normal dev mode and performs better than the original...

lora

### Checklist - [ ] 1. If the issue you raised is not a feature but a question, please raise a discussion at https://github.com/mit-han-lab/nunchaku/discussions/new/choose. Otherwise, it will be closed. -...

enhancement

### Motivation

https://github.com/bytedance/UNO

### Related resources

_No response_

enhancement