
Question about LoRA alpha

Open vishaal27 opened this issue 1 year ago • 1 comment

Hi, thanks for your great work. I noticed that in your scripts, the LoRA alpha is hard-coded to 128 and the rank r to 4, which gives a scaling factor of alpha / r = 128 / 4 = 32: https://github.com/eric-ai-lab/PEViT/blob/be6fb43ff54adeeffe720c663dd238976070558e/vision_benchmark/evaluation/lora_model.py#L455-L463 Was there a principled justification for these choices? I am wondering whether you tuned these values at all and whether you can suggest good values to use.
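For reference, here is a minimal sketch of how alpha and r typically combine into a scaling factor in a LoRA-style linear layer. The class and variable names below are illustrative only, not the exact PEViT or microsoft/LoRA code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LoRALinear(nn.Module):
    """Illustrative LoRA-style layer: frozen base weight plus a low-rank
    update whose contribution is scaled by lora_alpha / r."""

    def __init__(self, in_features, out_features, r=4, lora_alpha=128):
        super().__init__()
        # Frozen pretrained weight (stand-in initialization here).
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        nn.init.kaiming_uniform_(self.weight, a=5 ** 0.5)
        self.weight.requires_grad = False

        # Low-rank factors: A is (r x in), B is (out x r). B starts at zero,
        # so the LoRA update is a no-op at initialization.
        self.lora_A = nn.Parameter(torch.zeros(r, in_features))
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        nn.init.normal_(self.lora_A, std=0.02)

        # With the values asked about here (lora_alpha=128, r=4),
        # the scaling works out to 128 / 4 = 32.
        self.scaling = lora_alpha / r

    def forward(self, x):
        base = F.linear(x, self.weight)
        update = F.linear(F.linear(x, self.lora_A), self.lora_B)
        return base + update * self.scaling
```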

vishaal27 avatar Sep 10 '23 11:09 vishaal27

Hi, thanks for your interest! The setting is inherited from LoRA's official development code: https://github.com/microsoft/LoRA/tree/snapshot-9-15-2021 https://github.com/microsoft/LoRA/blob/snapshot-9-15-2021/src/model.py

jkooy avatar Sep 17 '23 18:09 jkooy