
Training code

Open · Yanllan opened this issue · 4 comments

Hello! Your work is excellent and I am very interested in it. I wonder when you will open-source the training code or provide some examples. Thanks!

Yanllan · Mar 01 '24 09:03

We will release the training code within one month.

csuhan · Mar 02 '24 03:03

Hi @Yanllan , we have just released the training code. Feel free to tell us if you need any help.

csuhan · Mar 08 '24 07:03

First of all, congratulations on the acceptance at CVPR! Secondly, due to GPU limitations, do you have a code reference for LoRA fine-tuning? I only have an A800.

Yanllan · Mar 10 '24 07:03

We have implemented LoRA tuning for plain LLaMA at: https://github.com/Alpha-VLLM/LLaMA2-Accessory/blob/main/accessory/model/LLM/llama_peft.py

You can 1. add LoRA layers to onellm.py, and 2. freeze the LLM and enable only the LoRA layers in its __init__ function.
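
A minimal sketch of those two steps in plain PyTorch. `LoRALinear`, `add_lora_and_freeze`, and the default rank/alpha values are illustrative only, not names from the OneLLM or LLaMA2-Accessory code; the linked llama_peft.py is the reference implementation.

```python
# Illustrative sketch, not OneLLM code: wrap Linear layers with LoRA and
# freeze everything except the LoRA parameters.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Wraps a frozen nn.Linear with a trainable low-rank update W + scaling * (B @ A)."""

    def __init__(self, base: nn.Linear, rank: int = 16, alpha: float = 32.0):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)        # step 2: freeze the base LLM weight
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        self.lora_a = nn.Linear(base.in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)             # start as a zero update
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scaling * self.lora_b(self.lora_a(x))


def add_lora_and_freeze(model: nn.Module, rank: int = 16) -> nn.Module:
    """Step 1: replace nn.Linear modules with LoRA-wrapped versions.
    Step 2: leave only the LoRA parameters trainable."""
    for name, module in model.named_children():
        if isinstance(module, nn.Linear):
            setattr(model, name, LoRALinear(module, rank=rank))
        else:
            add_lora_and_freeze(module, rank=rank)
    for pname, p in model.named_parameters():
        p.requires_grad_("lora_" in pname)
    return model
```

In practice you would call something like `add_lora_and_freeze` on the transformer (or only on its attention/FFN projections) inside the model's __init__, then pass only the parameters with `requires_grad=True` to the optimizer.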

csuhan · Mar 27 '24 11:03