[T-DNA] New adapter for easy domain adaptation
🌟 New adapter setup
Model description
The ACL 2021 paper Taming Pre-trained Language Models with N-gram Representations for Low-Resource Domain Adaptation introduces a Transformer-based Domain-aware N-gram Adaptor (T-DNA) that learns and incorporates semantic representations of word combinations in a new domain. The paper shows that injecting domain-specific n-grams through the proposed adapter is an effective and efficient approach to domain adaptation, and that the information carried by larger text granularity is highly important for language processing across domains.
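
For illustration, below is a minimal PyTorch sketch of how a domain-aware n-gram adapter could fuse embeddings of matched domain n-grams into Transformer hidden states. All class and parameter names here (NGramAdapter, ngram_vocab_size, etc.) are hypothetical and do not reflect the official T-DNA implementation; see https://github.com/shizhediao/T-DNA for the authors' code.

```python
# Illustrative sketch only: fuse domain-specific n-gram embeddings into token
# representations via cross-attention. Not the official T-DNA implementation.
import torch
import torch.nn as nn


class NGramAdapter(nn.Module):
    """Fuse embeddings of matched domain n-grams into token hidden states."""

    def __init__(self, hidden_size: int, ngram_vocab_size: int, num_heads: int = 8):
        super().__init__()
        # Embedding table for domain n-grams (index 0 reserved for padding).
        self.ngram_embeddings = nn.Embedding(ngram_vocab_size, hidden_size, padding_idx=0)
        # Tokens attend to the n-gram representations matched in the input.
        self.cross_attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.layer_norm = nn.LayerNorm(hidden_size)

    def forward(self, hidden_states: torch.Tensor, ngram_ids: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq_len, hidden)  -- output of a Transformer layer
        # ngram_ids:     (batch, num_ngrams)       -- ids of domain n-grams found in the input
        ngram_emb = self.ngram_embeddings(ngram_ids)            # (batch, num_ngrams, hidden)
        fused, _ = self.cross_attn(hidden_states, ngram_emb, ngram_emb)
        # Residual connection keeps the pre-trained token representation intact.
        return self.layer_norm(hidden_states + fused)


if __name__ == "__main__":
    adapter = NGramAdapter(hidden_size=768, ngram_vocab_size=10_000)
    tokens = torch.randn(2, 16, 768)            # dummy token hidden states
    ngrams = torch.randint(1, 10_000, (2, 5))   # dummy matched n-gram ids
    print(adapter(tokens, ngrams).shape)        # torch.Size([2, 16, 768])
```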
Open source status
- [x] the model implementation is available: https://github.com/shizhediao/T-DNA
- [ ] the model weights are available: not available for now, but the authors plan to release them
- [x] who are the authors: Shizhe Diao (@shizhediao), Ruijia Xu, Hongjin Su, Yilei Jiang, Yan Song, Tong Zhang