Benjamin Bossan

584 comments by Benjamin Bossan

@mnoukhov Let me know when the PR is ready for review.

> * the user passing in the `base_model_revision` to `get_peft_config` or `PeftModel.from_pretrained` or
> * adding `revision` to `PretrainedConfig` in transformers

I don't think that the latter would make sense....

Thanks for discovering and fixing that issue. I created a LoRA adapter with and without revision:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

model = AutoModelForCausalLM.from_pretrained("hf-internal-testing/tiny-random-BertModel").eval()
# without revision
model = PeftModel.from_pretrained(model, "peft-internal-testing/tiny-random-BertModel-lora")
# with revision...
```
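As a quick illustration of the revision handling discussed here, a minimal sketch of loading the adapter from a specific revision of the adapter repo; the branch name `v1` is hypothetical, and how exactly the revision keyword is resolved depends on this PR:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("hf-internal-testing/tiny-random-BertModel").eval()

# Load the adapter weights from a specific revision of the adapter repo.
# "v1" is a hypothetical branch/tag name; the `revision` argument is forwarded
# to the Hub download, analogous to `from_pretrained` in transformers.
peft_model = PeftModel.from_pretrained(
    base,
    "peft-internal-testing/tiny-random-BertModel-lora",
    revision="v1",  # hypothetical branch/tag name
)
```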

> The only issue I can foresee is if the default branch name changes in the future i.e. github changing from `master` to `main`. It would be annoying to deal...

@mnoukhov PEFT release is out, merging this now. Again, thanks for the PR.

@raven38 Thanks for the updates, I'm currently reviewing the PR, but it'll take some time. Meanwhile, could you please run `make style` on your code so that the CI passes?

Hmm, code quality checks still fail:

```
ruff src tests examples docs scripts docker
ruff format --check src tests examples docs scripts docker
Would reformat: src/peft/tuners/reft/config.py
Would reformat: src/peft/tuners/reft/layer.py
Would...
```

@frankaging Thanks a lot for taking a look. Regarding your questions, I'll let @raven38 provide the final answer. Personally, I think 2 is right and I don't quite understand 1,...

> Since for LoReFT, we want to target the residual stream (transformer block/layer output), what should we put for the `target_modules` in that case? Thanks!

I see, I think I...

> Thanks for your reply. Yes, we need to target the whole block module itself (e.g., `model.layers[15]`). Will this be doable?

Yes, I don't see why not. We would have...
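To make the idea concrete, here is a small sketch of how a whole decoder block can be addressed by its module name; the tiny Llama checkpoint name is an assumption for illustration, and the `ReftConfig` usage at the end is purely hypothetical since the ReFT integration in this PR is not finalized:

```python
from transformers import AutoModelForCausalLM

# Tiny test model with a Llama-style layout (assumed checkpoint name).
model = AutoModelForCausalLM.from_pretrained("hf-internal-testing/tiny-random-LlamaForCausalLM")

# A whole decoder block appears under a name like "model.layers.0", so it can
# be matched by target_modules the same way individual Linear layers are.
print([name for name, _ in model.named_modules() if name.endswith("layers.0")])
# ['model.layers.0']

# Hypothetical usage once ReFT is merged (class name and fields are assumptions):
# from peft import ReftConfig, get_peft_model
# config = ReftConfig(target_modules=["layers.0"])
# model = get_peft_model(model, config)
```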