Implement Unlimiformer in transformers
Feature request
Unlimiformer (https://github.com/abertsch72/unlimiformer) promises to support unlimited input length for any transformer-based encoder-decoder model at sub-linear cost in time, by letting cross-attention retrieve from a k-nearest-neighbor index over the encoder's hidden states instead of attending to the full input at once.
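For illustration, here is a minimal sketch of that retrieval idea, not the Unlimiformer implementation itself. The checkpoint name and the chunk/top-k sizes are illustrative assumptions: the long input is encoded in fixed-size chunks, all encoder hidden states are pooled into one datastore, and a query retrieves only its top-k nearest states rather than attending over everything.

```python
# Sketch only: retrieval over pooled encoder states, not the Unlimiformer code.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "facebook/bart-base"          # assumed example checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name).eval()
encoder = model.get_encoder()

def encode_long_input(text, chunk_len=512, top_k=16):
    # 1) Chunk the (arbitrarily long) input and encode each chunk separately,
    #    so the encoder never sees more than chunk_len tokens at a time.
    ids = tokenizer(text, return_tensors="pt").input_ids[0]
    chunks = [ids[i : i + chunk_len] for i in range(0, len(ids), chunk_len)]
    with torch.no_grad():
        states = [
            encoder(input_ids=c.unsqueeze(0)).last_hidden_state[0] for c in chunks
        ]
    # 2) Pool all hidden states into a single datastore of candidate keys/values.
    datastore = torch.cat(states, dim=0)            # (total_tokens, hidden)

    def retrieve(query):                            # query: (hidden,)
        # 3) At decoding time, a cross-attention query would look at only its
        #    top-k nearest encoder states instead of the whole sequence.
        scores = datastore @ query                  # (total_tokens,)
        top = torch.topk(scores, k=min(top_k, datastore.size(0))).indices
        return datastore[top]                       # (top_k, hidden)

    return datastore, retrieve
```

The actual repo appears to go further and wires this retrieval directly into the decoder's cross-attention (with an approximate k-NN index), so it can be applied to an already-trained model at test time.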
Motivation
Context lengths of current encoder-decoder models are fairly limited.
Your contribution
I can help with testing.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Bump