low-rank-adaptation topic
low-rank-adaptation repositories
pytorch-lora (58 stars, 15 forks)
LORA: Low-Rank Adaptation of Large Language Models implemented using PyTorch
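The idea behind LoRA is simple enough to show in a few lines: the pretrained weight W stays frozen, and a trainable low-rank product B·A (scaled by alpha/r) is added to it. Below is a minimal framework-agnostic numpy sketch of that update rule; the dimensions and names (`d_in`, `d_out`, `r`, `alpha`) are illustrative assumptions, not code from the repository.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only
d_in, d_out, r, alpha = 16, 8, 4, 8

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable, small random init
B = np.zeros((d_out, r))                   # trainable, zero init -> adapter starts as identity

def lora_forward(x):
    # y = W x + (alpha / r) * B (A x); only A and B would receive gradients
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
y = lora_forward(x)
# With B zero-initialized, the adapted layer matches the frozen layer exactly
assert np.allclose(y, W @ x)
```

Because B starts at zero, training begins from the unmodified pretrained model, and at inference time B·A can be merged back into W so the adapter adds no latency.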
LLaMA-8bit-LoRA (146 stars, 11 forks)
Repository for Chat LLaMA - training a LoRA for the LLaMA (1 or 2) models on HuggingFace with 8-bit or 4-bit quantization. Research only.
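The combination this repo describes, a quantized frozen base model plus a full-precision LoRA adapter, can be sketched as follows. This is an assumption-laden illustration (simple per-row absmax int8 quantization, made-up dimensions), not the repository's actual pipeline, which relies on HuggingFace and bitsandbytes.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 16, 4, 8  # hypothetical sizes

W = rng.standard_normal((d_out, d_in))  # frozen base weight

# Per-row absmax int8 quantization of the frozen weight (a simplification
# of what 8-bit libraries actually do)
scale = np.abs(W).max(axis=1, keepdims=True) / 127.0
W_q = np.round(W / scale).astype(np.int8)

A = rng.standard_normal((r, d_in)) * 0.01  # LoRA factors stay in full precision
B = np.zeros((d_out, r))

def forward(x):
    W_deq = W_q.astype(np.float64) * scale  # dequantize the base on the fly
    return W_deq @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
y = forward(x)
```

Only the small A and B matrices are trained, so gradients and optimizer state never touch the int8 base weights; that is what makes fine-tuning large models fit in modest GPU memory.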
Galore-pytorch (20 stars, 1 fork)
GaLore: Memory-Efficient LLM Training by Gradient Low-Rank Projection
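Unlike LoRA, GaLore keeps the weights full-rank and instead projects the gradient into a low-rank subspace, so optimizer state (e.g. Adam moments) is stored in the small projected space. A minimal numpy sketch of one such step, with assumed shapes and a projection taken from the gradient's top singular vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r, lr = 32, 64, 4, 0.1  # hypothetical shapes and learning rate

W = rng.standard_normal((m, n))  # full-rank weight being trained
G = rng.standard_normal((m, n))  # stand-in for a full gradient

# Projection matrix from the gradient's top-r left singular vectors;
# in GaLore this is recomputed only periodically, not every step
U, _, _ = np.linalg.svd(G, full_matrices=False)
P = U[:, :r]                 # (m, r)

G_low = P.T @ G              # (r, n) projected gradient: optimizer state lives here
W = W - lr * (P @ G_low)     # project back up and apply the update
```

The memory saving comes from the optimizer: moments for an (r, n) matrix instead of (m, n), while W itself is never factorized.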
MOELoRA-peft (90 stars, 9 forks)
[SIGIR'24] The official implementation of MOELoRA.
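MOELoRA combines LoRA with a mixture-of-experts: several small LoRA expert pairs share one frozen weight, and a gate mixes them per task. The sketch below is a hedged reconstruction of that idea under assumed shapes (`n_experts`, `n_tasks`, a softmax gate over task embeddings); the real implementation's architecture may differ in detail.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r, alpha = 16, 8, 2, 4   # hypothetical sizes
n_experts, n_tasks = 4, 3

W = rng.standard_normal((d_out, d_in))               # frozen base weight
A = rng.standard_normal((n_experts, r, d_in)) * 0.01 # one low-rank pair per expert
B = np.zeros((n_experts, d_out, r))
gate_W = rng.standard_normal((n_tasks, n_experts))   # task -> expert logits

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def moelora_forward(x, task_id):
    g = softmax(gate_W[task_id])  # per-task mixture weights over experts
    delta = sum(g[i] * (B[i] @ (A[i] @ x)) for i in range(n_experts))
    return W @ x + (alpha / r) * delta

x = rng.standard_normal(d_in)
y = moelora_forward(x, task_id=0)
```

Each task gets its own soft combination of shared experts, which is what lets one adapter set serve multiple tasks without training a separate LoRA per task.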