elementary-particle
Are we going to complete this rework eventually? It is indeed a lot of work, and we should not leave them around.
Sure, a simple example is to create a LoRA adapter for a local base model and save it. For example, create a `PeftModel` for a local snapshot of `mistralai/Mistral-7B-v0.1`, and...
```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, PeftModel
from peft import prepare_model_for_kbit_training, get_peft_model

local_dir = 'path/to/model'
base_model = AutoModelForCausalLM.from_pretrained(local_dir)
peft_config = LoraConfig(
    lora_alpha=16,
    lora_dropout=0.1,
    r=64,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", ...
```
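The snippet above is cut off in the original comment. As a rough sketch of how it might continue, the remaining `target_modules` entries and the `./lora-adapter` output path below are placeholders, not taken from the original:

```python
# Complete the config; the original comment is truncated after "q_proj",
# so the rest of the target_modules list here is only a placeholder.
peft_config = LoraConfig(
    lora_alpha=16,
    lora_dropout=0.1,
    r=64,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "v_proj"],
)

# Wrap the local base model with the LoRA adapter and save only the adapter weights.
model = get_peft_model(base_model, peft_config)
model.print_trainable_parameters()
model.save_pretrained("./lora-adapter")

# Later, reload the adapter on top of a freshly loaded base model.
base_model = AutoModelForCausalLM.from_pretrained(local_dir)
model = PeftModel.from_pretrained(base_model, "./lora-adapter")
```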
Is this sufficient? @BenjaminBossan
Certainly, I followed the conventions in `testing_common.py` and created a unit test for the issue. Are any further checks needed? @BenjaminBossan
I fixed the test case as advised and added comments to explain the issue. Please review the changes, thanks.
Thanks for keeping up with this PR. The merge conflict is resolved.