
NLP HydraNet

Open · sam-h-bean opened this issue 3 years ago · 1 comment

Feature request

I think it would be great to be able to easily train a Tesla-style HydraNet, but using a transformer backbone. The model would take a model_id and a series of tasks. The dataset would need to supply labels for each of the task heads, and the loss would be controllable via per-task coefficients in the configuration. A rough sketch of what I mean is below.
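To make the request concrete, here is a minimal sketch of the idea, not an existing transformers API: a shared backbone loaded by model_id, one head per task, and a loss that is a coefficient-weighted sum of per-task losses. Names like `MultiTaskTransformer`, `task_heads`, and `loss_coefficients` are hypothetical.

```python
import torch
import torch.nn as nn
from transformers import AutoModel


class MultiTaskTransformer(nn.Module):
    """Hypothetical multi-task ("HydraNet"-style) model with a shared transformer backbone."""

    def __init__(self, model_id, tasks, loss_coefficients):
        super().__init__()
        # Shared backbone loaded from the Hub by model_id.
        self.backbone = AutoModel.from_pretrained(model_id)
        hidden_size = self.backbone.config.hidden_size
        # One lightweight classification head per task: {task_name: num_labels}.
        self.task_heads = nn.ModuleDict(
            {name: nn.Linear(hidden_size, num_labels) for name, num_labels in tasks.items()}
        )
        # Per-task loss weights supplied via configuration: {task_name: coefficient}.
        self.loss_coefficients = loss_coefficients

    def forward(self, input_ids, attention_mask, labels=None):
        # labels is a dict mapping task name -> label tensor for that task.
        outputs = self.backbone(input_ids=input_ids, attention_mask=attention_mask)
        pooled = outputs.last_hidden_state[:, 0]  # [CLS] token representation
        logits = {name: head(pooled) for name, head in self.task_heads.items()}

        loss = None
        if labels is not None:
            # Weighted sum of per-task losses, controlled by the configured coefficients.
            loss = sum(
                self.loss_coefficients.get(name, 1.0)
                * nn.functional.cross_entropy(logits[name], labels[name])
                for name in labels
            )
        return {"loss": loss, "logits": logits}


# Example usage (task names and label counts are made up for illustration):
model = MultiTaskTransformer(
    "bert-base-uncased",
    tasks={"sentiment": 2, "topic": 5},
    loss_coefficients={"sentiment": 1.0, "topic": 0.5},
)
```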

Motivation

This would allow people to train multi-task learners that share a single neural backbone and jointly learn a number of related tasks.

Your contribution

I have some code working for this. I'm unsure whether there is already a pattern for this kind of thing, or whether it is novel enough to contribute back to OSS.

sam-h-bean · Jul 07 '22 14:07

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

github-actions[bot] · Aug 06 '22 15:08