
Results 21 comments of r0

@BenjaminBossan please review the latest changes now. I believe I have addressed all your comments, but let me know if I missed something. I have added test cases where we...

@BenjaminBossan I have made the updates. As of now, we have 3 outstanding items to resolve: 1. What to do in case `target_modules` is a str: https://github.com/huggingface/peft/pull/2879#discussion_r2503550670: I...
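
For context on the open question above, a minimal sketch of the usual distinction (this is an illustration, not the PR's actual code): a str `target_modules` is commonly treated as a regex matched against the full module name, while a list matches by module-name suffix. The function name `matches_target` is hypothetical.

```python
import re

def matches_target(key: str, target_modules) -> bool:
    """Illustrative sketch: decide whether a module name matches target_modules.

    A str is treated as a regex that must match the whole name; a list
    matches if the name equals an entry or ends with ".<entry>".
    """
    if isinstance(target_modules, str):
        return re.fullmatch(target_modules, key) is not None
    return any(key == t or key.endswith("." + t) for t in target_modules)

print(matches_target("model.layers.0.q_proj", ".*q_proj"))   # True (regex)
print(matches_target("model.layers.0.q_proj", ["q_proj"]))   # True (suffix)
```

The two behaviors are not interchangeable, which is why the str case needs a separate decision in the PR.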

1. https://github.com/huggingface/peft/pull/2879#discussion_r2518948684: Good suggestion. I have added a new function to find a layer by reference tensors.
2. I have resolved the merge conflict and addressed your remaining comments.
3. ...
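
A hedged sketch of what "find a layer by reference tensors" could look like (the helper name and exact signature here are assumptions, not the PR's implementation): walk `named_modules()` and compare parameter tensors by object identity.

```python
import torch
import torch.nn as nn

def find_module_by_tensor(model: nn.Module, ref: torch.Tensor):
    """Hypothetical helper: return (name, module) for the submodule whose
    parameter is the very same tensor object as `ref` (identity, not value
    equality), or None if no submodule owns it."""
    for name, module in model.named_modules():
        for param in module.parameters(recurse=False):
            if param is ref:
                return name, module
    return None

model = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))
name, mod = find_module_by_tensor(model, model[1].weight)
print(name)  # "1"
```

Matching by identity (`is`) rather than value avoids false positives when two layers happen to have equal weights.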

@BenjaminBossan, I have fixed the test and removed 3 redundant tests that are no longer required.

@BenjaminBossan Resolved your comment.

@BenjaminBossan I made a small commit: in one of my earlier commits, I had made a change where the target modules were saved as `model.embed_tokens` instead of `embed_tokens`. This...
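
The naming issue above comes from fully qualified module paths including the parent prefix. A minimal sketch of the kind of normalization involved (the helper name and the fixed `"model."` prefix are illustrative assumptions, not the commit's actual code):

```python
def strip_parent_prefix(name: str, prefix: str = "model.") -> str:
    """Hypothetical illustration: normalize a fully qualified module name
    like "model.embed_tokens" to the relative name "embed_tokens"."""
    return name[len(prefix):] if name.startswith(prefix) else name

print(strip_parent_prefix("model.embed_tokens"))  # embed_tokens
print(strip_parent_prefix("embed_tokens"))        # embed_tokens (unchanged)
```

Saving the relative name keeps the config portable, since the `model.` prefix depends on how the wrapper model is constructed.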

@BenjaminBossan Let me know if any steps remain on my side before the final push.

Hi @githubnemo, it would be very helpful if you could review the PR. One of our internal features depends on this :)

What's the current state of this implementation? I can see that the vLLM docs already have it: https://docs.vllm.ai/en/stable/features/disagg_prefill.html, but it doesn't seem to work with all models. I tested it...