gemma-2B-10M

LoRA fine-tuning code?

Open sandeep-krutrim opened this issue 8 months ago • 0 comments

Hi,

Can this model be fine-tuned with LoRA without any additional scripts? Also, if we fine-tune with a sequence length of 512 or 1k, will that affect inference at longer context lengths, say 16k or 32k?
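For context, the LoRA setup I have in mind is roughly the following. This is a minimal NumPy sketch of the low-rank adapter idea, not the repo's actual code; the class and parameter names here are hypothetical:

```python
import numpy as np

# Sketch of a LoRA-adapted linear layer: the frozen base weight W is
# augmented with a low-rank update (alpha / r) * B @ A, and only the
# small matrices A and B would be trained.
class LoRALinear:
    def __init__(self, d_in, d_out, r=8, alpha=16, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((d_out, d_in)) * 0.02  # frozen base weight
        self.A = rng.standard_normal((r, d_in)) * 0.01      # trainable down-projection
        self.B = np.zeros((d_out, r))                       # trainable up-projection, zero-init
        self.scale = alpha / r

    def __call__(self, x):
        # base path + scaled low-rank adapter path
        return x @ self.W.T + self.scale * (x @ self.A.T) @ self.B.T

layer = LoRALinear(d_in=64, d_out=32)
x = np.ones((4, 64))
y = layer(x)
print(y.shape)  # (4, 32)
# With B zero-initialized, the adapter contributes nothing at the start
# of training, so the adapted layer matches the base layer exactly:
print(np.allclose(y, x @ layer.W.T))  # True
```

In practice one would wrap the model's attention projections with something like the PEFT library rather than hand-rolling this, which is why I'm asking whether that works here out of the box.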

sandeep-krutrim avatar Jun 14 '24 04:06 sandeep-krutrim