NLP HydraNet
Feature request
I think it would be awesome to be able to easily train a Tesla-style HydraNet using a transformer backbone. The model would take a model_id and a list of tasks. The dataset would need to supply labels for each task head, and the per-task losses would be weighted by coefficients set in the configuration.
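A minimal sketch of the idea in plain PyTorch, not an existing `transformers` API — the class name `HydraNet`, the `tasks` dict (task name → number of labels), and the `loss_coefs` weighting are all illustrative assumptions:

```python
import torch
import torch.nn as nn


class HydraNet(nn.Module):
    """Hypothetical multi-task model: one shared transformer backbone,
    one classification head per task, loss weighted per task."""

    def __init__(self, vocab_size, d_model, tasks, loss_coefs=None):
        # tasks: dict mapping task name -> number of labels for that head
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)
        self.heads = nn.ModuleDict(
            {name: nn.Linear(d_model, n_labels) for name, n_labels in tasks.items()}
        )
        # per-task loss coefficients, defaulting to equal weighting
        self.loss_coefs = loss_coefs or {name: 1.0 for name in tasks}

    def forward(self, input_ids, labels=None):
        hidden = self.backbone(self.embed(input_ids))  # (batch, seq, d_model)
        pooled = hidden.mean(dim=1)                    # simple mean pooling
        logits = {name: head(pooled) for name, head in self.heads.items()}
        loss = None
        if labels is not None:
            # weighted sum of each head's cross-entropy loss
            loss = sum(
                self.loss_coefs[name]
                * nn.functional.cross_entropy(logits[name], labels[name])
                for name in logits
            )
        return logits, loss


# Example: two tasks sharing one backbone
model = HydraNet(
    vocab_size=100,
    d_model=32,
    tasks={"sentiment": 2, "topic": 5},
    loss_coefs={"sentiment": 1.0, "topic": 0.5},
)
ids = torch.randint(0, 100, (4, 10))
labels = {
    "sentiment": torch.randint(0, 2, (4,)),
    "topic": torch.randint(0, 5, (4,)),
}
logits, loss = model(ids, labels)
```

In the real feature, the backbone would presumably come from `AutoModel.from_pretrained(model_id)` instead of the toy encoder above, with the same head-per-task and coefficient-weighted loss structure.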
Motivation
This would allow people to train multi-task learners that share a single neural backbone and co-learn a number of related tasks.
Your contribution
I have some code working for this. I'm unsure whether there is already a pattern for this type of thing, or whether it is novel enough to contribute back to OSS.