                        LlamaAttention forward function type hint is incorrect
System Info
In the current version (4.52.4), the return type hint of `LlamaAttention.forward`
https://github.com/huggingface/transformers/blob/aa798b7ac9ff5018b3578eb927dc438671ab6a3e/src/transformers/models/llama/modeling_llama.py#L231
does not match what the function actually returns:
https://github.com/huggingface/transformers/blob/aa798b7ac9ff5018b3578eb927dc438671ab6a3e/src/transformers/models/llama/modeling_llama.py#L264
Judging by the git blame, this mismatch was likely introduced by the attention refactor in #35235.
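For reference, a quick way to inspect the declared annotation at runtime (a minimal sketch, assuming `transformers==4.52.4` is installed; the expected output in the comment paraphrases the linked line):

```python
import inspect
from transformers.models.llama.modeling_llama import LlamaAttention

# Print the return annotation declared on LlamaAttention.forward.
sig = inspect.signature(LlamaAttention.forward)
print(sig.return_annotation)
# Expected (per the linked L231): a three-element tuple hint,
# tuple[torch.Tensor, Optional[torch.Tensor], Optional[tuple[torch.Tensor]]],
# while the body at the linked L264 ends with
# `return attn_output, attn_weights` -- a two-element tuple.
```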
Who can help?
@ArthurZucker
Information
- [ ] The official example scripts
- [ ] My own modified scripts
Tasks
- [ ] An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
Reproduction
Inspect the two linked lines in `modeling_llama.py`: the declared return type and the actual `return` statement do not match.
Expected behavior
The return type hint should match what the function actually returns, to avoid confusion for readers and type checkers.
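A minimal sketch of a possible fix, assuming the body really does end with `return attn_output, attn_weights` as at the linked line (parameter list abbreviated; names taken from the linked code):

```python
from typing import Optional

import torch


def forward(
    self,
    hidden_states: torch.Tensor,
    # ... remaining parameters unchanged ...
) -> tuple[torch.Tensor, Optional[torch.Tensor]]:
    # Annotation now reflects the actual two-element return:
    # (attn_output, attn_weights), where attn_weights may be None
    # depending on the attention implementation.
    ...
```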