LLaMA-Factory
Support Llama 3.2 VL (WIP).
🚀 What does this PR do?
Support Llama-3.2-11B-Vision.
✅ Before submitting
- [x] Did you read the contributor guideline?
- [x] Did you write any new necessary tests?
🔗 Linked issues
#5549
⚠️ IMPORTANT
bitsandbytes 8-bit quantization is not functional; 4-bit quantization works as expected.
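Given the limitation above, a minimal QLoRA-style fine-tuning config for this model might look like the sketch below. This is an illustration only: the exact keys follow LLaMA-Factory's usual yaml config conventions (e.g. `quantization_bit`), and the `template` and `dataset` values are placeholder assumptions, not part of this PR.

```yaml
# Hypothetical LLaMA-Factory config sketch for Llama-3.2-11B-Vision.
# Note: use quantization_bit: 4 — 8-bit bitsandbytes is not functional per this PR.
model_name_or_path: meta-llama/Llama-3.2-11B-Vision-Instruct
stage: sft
do_train: true
finetuning_type: lora
quantization_bit: 4        # 4-bit works; 8-bit does not
template: mllama           # assumed template name for this model family
dataset: mllm_demo         # placeholder dataset name
output_dir: saves/llama3.2-vision-lora
```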