2 comments of Alan Liu

For instance, if you wish to incorporate a rank-8 LoRA into the attention layer's three matrices (Q, K, V) within a model, you can use code like the following:

```python
lora_A...
```
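A minimal, self-contained sketch of what such code might look like (the `LoRALinear` class, the parameter names, and the `d_model = 512` sizing are illustrative assumptions, not the comment's exact snippet), assuming PyTorch and the standard LoRA parameterization `W x + (alpha / r) * B A x`:

```python
import math

import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """A frozen linear layer with a rank-r LoRA update: W x + (alpha / r) * B A x."""

    def __init__(self, in_features: int, out_features: int, r: int = 8, alpha: int = 16):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # Freeze the pretrained weights; only the LoRA matrices are trained.
        self.linear.weight.requires_grad = False
        self.linear.bias.requires_grad = False

        # lora_A: (r, in_features), Kaiming-initialized; lora_B: (out_features, r), zeros.
        # With B = 0 the update starts at zero, so training begins from the pretrained model.
        self.lora_A = nn.Parameter(torch.empty(r, in_features))
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        nn.init.kaiming_uniform_(self.lora_A, a=math.sqrt(5))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.linear(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling


# Apply a rank-8 LoRA to each of the three attention projections.
d_model = 512
q_proj = LoRALinear(d_model, d_model, r=8)
k_proj = LoRALinear(d_model, d_model, r=8)
v_proj = LoRALinear(d_model, d_model, r=8)

x = torch.randn(2, 10, d_model)  # (batch, seq_len, d_model)
q, k, v = q_proj(x), k_proj(x), v_proj(x)
```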

@Andrei-Aksionov Yes. You can check my [note](https://github.com/clalanliu/LoRA_Notes/tree/main).

> And if so why this approach is not used for the lora_A?

There is no need to do so, because the input of lora_A is the original input x, which is the same for Q, K, and V; the splitting only matters on the output side, where lora_B's update has to land in the correct slices of the fused QKV projection.
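A simplified sketch of that point (sharing one `lora_A` across projections for brevity; real fused-QKV implementations may stack a separate A block per enabled projection, and all names and shapes here are assumptions for illustration): the input side needs no splitting or padding because every projection sees the same x, while each adapted projection's lora_B output must be scattered into its own slice of the fused `3 * d_model` QKV dimension.

```python
import torch
import torch.nn as nn

d_model, r = 512, 8

# Input side: a single lora_A suffices, because Q, K, and V all receive the
# same input x. Nothing has to be split or zero-padded here.
lora_A = nn.Parameter(torch.randn(r, d_model) * 0.01)

# Output side: each adapted projection needs its own lora_B, because its
# update must land in that projection's slice of the fused QKV output.
lora_B_q = nn.Parameter(torch.zeros(d_model, r))
lora_B_v = nn.Parameter(torch.zeros(d_model, r))  # K left un-adapted here

x = torch.randn(2, 10, d_model)        # (batch, seq_len, d_model)
after_A = x @ lora_A.T                 # (2, 10, r) -- shared by Q, K, V

# Scatter the per-projection updates into the fused QKV dimension,
# leaving the un-adapted K slice at zero.
qkv_update = torch.zeros(2, 10, 3 * d_model)
qkv_update[..., 0 * d_model : 1 * d_model] = after_A @ lora_B_q.T  # Q slice
qkv_update[..., 2 * d_model : 3 * d_model] = after_A @ lora_B_v.T  # V slice
```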