
qwen nunchaku support

Open zwukong opened this issue 2 months ago • 1 comment

Thanks for your work on letting Qwen train in less than 4 GB of VRAM. Qwen Image and Qwen Image Edit both have Nunchaku support, and it is really fast and VRAM friendly. I wonder whether training works on the Nunchaku INT4 models.

zwukong avatar Oct 08 '25 07:10 zwukong

You generally want to train at the original precision and then quantize afterwards. That said, Nunchaku's quantization doesn't seem to perform well below 80 GB of VRAM without a lot of tweaking; it still took me hours to quantize a model with 32 GB of VRAM. Lower-bit training itself is still pretty new, and I've only really seen it on some newer LLMs, not on any diffusion models yet.
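For illustration, here is a minimal sketch of the "train at full precision, quantize afterwards" flow. It is not Nunchaku's API: its SVDQuant INT4 pipeline is a separate, GPU-heavy offline step with its own tooling. The sketch uses PyTorch's built-in dynamic int8 quantization as a stand-in for the post-training quantization stage, and the toy network stands in for the (LoRA-merged) diffusion transformer.

```python
# Toy sketch: train in the original precision, then quantize the finished
# weights as a separate post-training step. Dynamic int8 quantization here is
# only a stand-in for Nunchaku's actual INT4 quantization.
import torch
import torch.nn as nn

# Stand-in model -- imagine the trained (or LoRA-merged) transformer.
model = nn.Sequential(
    nn.Linear(64, 256),
    nn.GELU(),
    nn.Linear(256, 64),
)

# 1) Train at the original precision (fp32/bf16).
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
for _ in range(10):
    x = torch.randn(8, 64)
    loss = (model(x) - x).pow(2).mean()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# 2) Quantize the finished weights afterwards, as its own step.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)
print(quantized)
```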

rwfsmith avatar Nov 01 '25 23:11 rwfsmith