Mahdi Beitollahi


Based on #1655. Adds a `use_wlora` config option to `LoraLayer` that allows learning the combination weights (i.e. `wlora_weights`) of pre-trained LoRAs.

### Feature request

PEFT can combine pre-trained LoRA modules by averaging them or providing custom weights for weighted averaging. This [paper](https://arxiv.org/abs/2402.15414) showed that learning these weights is better than...
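The idea behind the request can be sketched in plain PyTorch: keep several pre-trained LoRA deltas frozen and train only a small vector of combination weights over them. This is a minimal illustration, not the PEFT implementation; names such as `WeightedLoraLinear` and `wlora_logits` are assumptions for this sketch.

```python
import torch
import torch.nn as nn


class WeightedLoraLinear(nn.Module):
    """Sketch: a frozen base linear layer plus several frozen LoRA
    (A, B) pairs, combined with learnable softmax-normalized weights.
    Only the combination logits are trained."""

    def __init__(self, base: nn.Linear, loras, scaling: float = 1.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False
        # Each LoRA contributes a delta W = B @ A; both factors stay frozen.
        self.As = nn.ParameterList(
            nn.Parameter(a, requires_grad=False) for a, _ in loras)
        self.Bs = nn.ParameterList(
            nn.Parameter(b, requires_grad=False) for _, b in loras)
        # The only trainable parameters: one logit per pre-trained LoRA.
        self.wlora_logits = nn.Parameter(torch.zeros(len(loras)))
        self.scaling = scaling

    def forward(self, x):
        # Softmax keeps the combination weights positive and summing to 1.
        w = torch.softmax(self.wlora_logits, dim=0)
        out = self.base(x)
        for wi, a, b in zip(w, self.As, self.Bs):
            out = out + wi * self.scaling * (x @ a.T @ b.T)
        return out
```

Training then optimizes just `wlora_logits` (a handful of scalars) on the target task, rather than fine-tuning any LoRA factors.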