yeoedward
Took a stab at it!
Fixed `black` formatting and merge conflicts.
CI test failures were caused by Llama dependencies not being available in earlier versions of the `transformers` library. To ensure backwards compatibility, I've done the following:

1. Added import guards...
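For context, a minimal sketch of the kind of import guard meant here (the guard actually used in this PR may be structured differently; the names below are illustrative):

```python
# Guard Llama-specific imports so the package still loads on older
# `transformers` releases that don't ship the Llama classes.
try:
    from transformers.models.llama.modeling_llama import LlamaModel  # noqa: F401

    llama_available = True
except ImportError:
    llama_available = False


def require_llama():
    """Raise a clear error if a Llama-dependent feature is used without support."""
    if not llama_available:
        raise ImportError(
            "This feature requires a `transformers` version with Llama support."
        )
```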
Rebased on `main` and added multi-adapter support.
Thanks @pacman100 for the review! Regarding your comment about not using `_modified_forward()`:

> 3. To make the implementation generic, having attention_target_modules as another config which is a List[nn.Module]. The AdaptedAttention...
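Purely to illustrate that suggestion (the names below are hypothetical and not the PR's actual API), the config entry could look roughly like this; module *names* are used here instead of the `List[nn.Module]` in the quoted comment so the config stays serializable:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ExampleAdapterConfig:
    # Hypothetical field sketching the reviewer's idea: the attention modules to
    # wrap are declared in the config rather than hard-coded, so the adapter can
    # target arbitrary architectures.
    attention_target_modules: List[str] = field(
        default_factory=lambda: ["self_attn"],
        metadata={"help": "Names of attention submodules to wrap with the adapter."},
    )
```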
@pacman100 I've made the changes you suggested and rebased on main.
@winglian I have validated the implementation on smaller problems that run on CPU, but I haven't tried reproducing the paper's results at full problem size. When you say it doesn't...
@pacman100 I've made the suggested changes and rebased on main.
@pacman100 I couldn't reproduce the style error (which was in a file unrelated to this PR), but I rebased on `main` and fixed some conflicts, which perhaps was the reason...