Ayush Singh
Hello! I noticed a small mismatch in Transformers version 4.52.4: the declared return type of the LlamaAttention forward method does not match what the method actually returns. This might...
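For context, here is a minimal sketch of the kind of mismatch I mean (simplified and illustrative only; the real 4.52.4 signature has more arguments, and the exact annotation and return values may differ):

```python
from typing import Optional, Tuple

import torch


class ToyAttention(torch.nn.Module):
    """Illustrative stand-in only; not the real LlamaAttention."""

    # The annotation promises a 3-tuple (output, weights, past_key_value)...
    def forward(
        self, hidden_states: torch.Tensor
    ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Tuple[torch.Tensor]]]:
        attn_output = hidden_states  # stand-in for the real attention computation
        attn_weights: Optional[torch.Tensor] = None
        # ...but only two values are actually returned, so the type hint is misleading.
        return attn_output, attn_weights


x = torch.zeros(1, 4)
out, weights = ToyAttention()(x)  # two values, despite the 3-tuple annotation
```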
@Rocketknight1 I have made the changes and am pushing them now. Please review.
#38795 @Rocketknight1
> Hi @ArkVex can you run `make fix-copies` to propagate that change to other models that are copying from Llama? That should make the CI pass!

I didn't get that... could...
> Hi @ArkVex, if you look at the tests on this PR, "check_repository_consistency" is failing. The reason is that some other models copy from Llama, and those copies don't match...
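For anyone else following this thread, my rough understanding of that check (illustrative only; the exact marker text and the set of models that copy from Llama may differ by version): modeling files that duplicate Llama code carry a `# Copied from ...` marker, the repository-consistency check compares each marked block against the original, and `make fix-copies` rewrites the copies so they match again. A sketch of what such a marker looks like, using Mistral purely as an example name:

```python
import torch.nn as nn


# The marker below is what the consistency check keys on; the real class body
# mirrors LlamaAttention line for line and is regenerated by `make fix-copies`.
# Copied from transformers.models.llama.modeling_llama.LlamaAttention with Llama->Mistral
class MistralAttention(nn.Module):
    """Placeholder body for illustration; not the real implementation."""
```

So after changing `LlamaAttention`, running `make fix-copies` from the repo root and committing the rewritten files is what should make `check_repository_consistency` pass.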
I ran the test... please check.
@Rocketknight1 you there?
@vanpelt @dxoigmn @tmm1 could anyone please review?