
Questions about KAdaptation implementation

Open · vishaal27 opened this issue 1 year ago · 1 comment

Hi, thanks for the great work and releasing the code to reproduce it.

I have a few questions regarding the Kronecker adaptation forward pass through the adapter modules:

(1) The scaling factor you use for KAdaptation is 1/5 of the scaling used in standard LoRA: https://github.com/eric-ai-lab/PEViT/blob/be6fb43ff54adeeffe720c663dd238976070558e/vision_benchmark/evaluation/model.py#L564 Is there a justification for this, or is it simply an empirical magic number?
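For concreteness, here is roughly what I mean (a minimal sketch with made-up names, shapes, and values, not the repo's actual code):

```python
import torch

# Minimal sketch of the scaling in question (hypothetical names and values).
lora_alpha, r, d = 16, 4, 64
lora_scaling = lora_alpha / r        # standard LoRA scaling, alpha / r
kadapt_scaling = lora_scaling / 5    # the extra 1/5 factor at the linked line

W = torch.randn(d, d)                # frozen pretrained projection weight
delta_W = torch.randn(d, d) * 0.01   # adapter update (e.g. a sum of Kronecker products)
x = torch.randn(2, d)

h = x @ W.T + kadapt_scaling * (x @ delta_W.T)   # scaled adapter contribution
```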

(2) In the forward pass through the adapter for the value matrix, it seems that you reuse the query's weight matrix (A as defined in the paper, as I understand it). Is this a typo/bug? https://github.com/eric-ai-lab/PEViT/blob/be6fb43ff54adeeffe720c663dd238976070558e/vision_benchmark/evaluation/model.py#L571-L580 Shouldn't line 580 be H = kronecker_product_einsum_batched(phm_rule2, Wv).sum(0) instead? A rough sketch of the forward pass as I read it is below.
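The identifier names below follow the linked code, but the dimensions and the batched Kronecker helper are my own guesses, so treat this as an illustration rather than the actual implementation:

```python
import torch

def kronecker_product_einsum_batched(A: torch.Tensor, B: torch.Tensor) -> torch.Tensor:
    """Batched Kronecker product: (b, m, n) x (b, p, q) -> (b, m*p, n*q)."""
    res = torch.einsum("bac,bkp->bakcp", A, B)
    return res.reshape(A.size(0), A.size(1) * B.size(1), A.size(2) * B.size(2))

# Hypothetical shapes: n_kron Kronecker terms, "slow" factors phm_rule / phm_rule2
# and "fast" factors Wq / Wv.
n_kron, m, n, p, q = 4, 8, 8, 96, 96
phm_rule  = torch.randn(n_kron, m, n)
phm_rule2 = torch.randn(n_kron, m, n)
Wq = torch.randn(n_kron, p, q)
Wv = torch.randn(n_kron, p, q)

delta_Wq = kronecker_product_einsum_batched(phm_rule, Wq).sum(0)  # query update
delta_Wv = kronecker_product_einsum_batched(phm_rule, Wv).sum(0)  # value update, reusing phm_rule as in the repo
# My question is whether the last line should instead read:
# delta_Wv = kronecker_product_einsum_batched(phm_rule2, Wv).sum(0)
```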

vishaal27 avatar Sep 10 '23 18:09 vishaal27

Hi, many thanks for your interest! The scaling factor is a hyper-parameter; you can adjust it manually, but in my experience it does not affect performance much. For the value matrix, we actually share the same decomposition, which is why it is reused.
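For a rough sense of what sharing the decomposition saves, here is a back-of-the-envelope parameter count using the same hypothetical shapes as the sketch above (illustration only, not the exact numbers in the code):

```python
# Back-of-the-envelope count for sharing the slow Kronecker factors across
# the query and value adapters (hypothetical shapes).
n_kron, m, n, p, q = 4, 8, 8, 96, 96
slow = n_kron * m * n            # one set of slow factors ("A" in the paper)
fast = n_kron * p * q            # fast factors for a single projection

separate = 2 * (slow + fast)     # independent decompositions for query and value
shared   = slow + 2 * fast       # slow factors shared across both projections

print(separate, shared)          # sharing removes one full set of slow factors
```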

jkooy avatar Sep 17 '23 18:09 jkooy