Soocheol Noh
The UMT5 model has its own relative attention bias in every self-attention layer, unlike T5, which computes the bias only in the first layer and shares it across the stack. Therefore, a corresponding converter in [the file](https://github.com/OpenNMT/CTranslate2/blob/2203ad5c8baf878a2d08e73095421e2ba033c89c/python/ctranslate2/converters/transformers.py) can start as follows:

```python
@register_loader("UMT5Config")
class UMT5Loader(T5Loader):
    ...
```
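Since the body of the loader is elided above, here is a minimal sketch of how it could look, meant to sit next to the existing loaders in `transformers.py`. It assumes `T5Loader` exposes an `architecture_name` property and a per-stack `set_stack` method, and that each layer spec carries `relative_attention_bias` / `relative_attention_max_distance` fields; those names are inferred from the T5 code path and are assumptions, not a verified API, so please check them against the linked file. The Hugging Face module path (`module.block[i].layer[0].SelfAttention.relative_attention_bias`) follows the `UMT5` modeling code.

```python
# Sketch only, intended for python/ctranslate2/converters/transformers.py.
# Method and attribute names (set_stack, relative_attention_bias,
# relative_attention_max_distance) are assumed from the T5 loading code and
# may need to be adapted to the actual spec classes.
@register_loader("UMT5Config")
class UMT5Loader(T5Loader):
    @property
    def architecture_name(self):
        return "UMT5ForConditionalGeneration"

    def set_stack(self, spec, module, is_decoder=False):
        # Reuse the T5 logic for embeddings, layer norms, attention, and FFN weights.
        super().set_stack(spec, module, is_decoder=is_decoder)

        # T5 defines the relative attention bias only in the first block and
        # shares it; UMT5 stores one bias table per layer, so copy each layer's
        # own table into its spec instead of reusing layer 0's.
        for layer_spec, block in zip(spec.layer, module.block):
            self_attention = block.layer[0].SelfAttention
            # Conversion of the torch weight to the spec's storage format is
            # assumed to be handled the same way as the other weights above.
            layer_spec.self_attention.relative_attention_bias = (
                self_attention.relative_attention_bias.weight
            )
            layer_spec.self_attention.relative_attention_max_distance = (
                module.config.relative_attention_max_distance
            )
```

If the rest of the T5 conversion path applies unchanged, registering the loader this way should let `ct2-transformers-converter --model <umt5 checkpoint> --output_dir <dir>` dispatch to it through the `UMT5Config` name, as with the other registered loaders.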