transformers
Fix Whisper Positional Embeddings when using decoder context
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.
cc @ArthurZucker
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Is this getting merged, or is it really not needed?
Seems to work well without it 😉 Also not sure whether the updates to Whisper fixed the original issue, would have to check!
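For context on what the PR title refers to: a minimal, hypothetical sketch of the underlying idea (not the actual transformers implementation — the function name and shape here are illustrative assumptions). When the Whisper decoder is conditioned on prompt/context tokens, the positional embeddings for newly generated tokens should be offset by the context length rather than restarting at position 0.

```python
# Hypothetical sketch, NOT the transformers API: shows why decoder
# positions must account for tokens already present in the context.

def position_ids(num_new_tokens: int, past_length: int) -> list[int]:
    """Return position indices for new tokens, offset by the number
    of tokens already in the decoder context (past_length)."""
    return list(range(past_length, past_length + num_new_tokens))

# Buggy behavior: positions restart at 0 regardless of context.
# Fixed behavior: a 3-token continuation after a 5-token prompt
# gets positions 5, 6, 7 instead of 0, 1, 2.
print(position_ids(3, 5))  # [5, 6, 7]
```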