                        fix the description of token used for Bart classification
What does this PR do?
Fixes the description of the token used for Bart classification: it uses the hidden state of the last EOS token, not a special pooled output.
https://github.com/huggingface/transformers/blob/7a8118947f3c6a802a9f63dc22c394961d38860f/src/transformers/models/bart/modeling_bart.py#L1520-L1527
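For illustration, here is a minimal sketch of the pooling step those linked lines perform, using hypothetical toy tensors (the real logic lives in `BartForSequenceClassification.forward`): the hidden states at `<eos>` positions are gathered and the last one per sequence is kept as the sentence representation.

```python
import torch

# Toy inputs (assumed shapes/ids for illustration only)
eos_token_id = 2
input_ids = torch.tensor([[0, 4, 5, 6, 2],
                          [0, 6, 7, 2, 1]])        # 1 = padding after <eos>
hidden_states = torch.randn(2, 5, 8)               # (batch, seq_len, hidden)

eos_mask = input_ids.eq(eos_token_id)              # True at <eos> positions
# Gather hidden states at <eos> positions, then keep the LAST one per sequence
sentence_representation = hidden_states[eos_mask].view(
    hidden_states.size(0), -1, hidden_states.size(-1))[:, -1, :]
```

Note this requires every sequence in the batch to contain the same number of `<eos>` tokens, which is why the model errors out otherwise.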
Before submitting
- [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the contributor guideline, Pull Request section?
- [ ] Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
- [ ] Did you write any new necessary tests?
@patrickvonplaten, @patil-suraj
@patrickvonplaten - does this look fine to you?
@ArthurZucker @gante could you take a look here?
@ArthurZucker @JoaoLages could you take a look here?
Maybe you meant to tag João Gante and not me, but here are my 2 cents anyway 😄: the description that @ekagra-ranjan wrote is correct, but the previous description was not incorrect either.
Taking the last EOS token embedding from the model's output is a form of pooling. In BERT, for example, we take the first token instead, which corresponds to the CLS token, and we have the same description for BertForSequenceClassification.
The previous description is simpler, more general, and not incorrect. I'm not against having more descriptive docstrings, but then it would make sense to review all the (...)ForSequenceClassification classes, not only BART.
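To make the contrast concrete, here is a hedged sketch of the BERT-style pooled output being compared against (simplified from what `BertPooler` does; tensor shapes are hypothetical): the first token's hidden state is passed through a dense layer and a tanh.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
hidden_states = torch.randn(2, 5, 8)               # (batch, seq_len, hidden)

# BERT-style "pooled output": first token ([CLS]) hidden state passed
# through a dense layer + tanh (simplified BertPooler sketch)
dense = nn.Linear(8, 8)
pooled_output = torch.tanh(dense(hidden_states[:, 0, :]))
```

Both approaches pool the sequence down to one vector per example; they just pick a different token (first vs. last EOS) and BERT adds a learned projection.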
Sorry @JoaoLages :sweat_smile: ! You are right indeed
@JoaoLages I see your point and I guess you are right! Thanks for sharing your thoughts.