Add Support for Electra
Support the architecture proposed in ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators.
Hey @amitkumarj441, thanks a lot for working on this! While there are still a few tests failing currently, I did a partial review of your changes and left some comments. All in all, it looks very good.
Besides fixing the failing tests, please have a look at our contribution guide for the required documentation steps for a new model. Let me know if anything is unclear or you need any assistance from our side!
Thanks @calpt for reviewing this PR. I will make changes as suggested soon.
Any update soon?
Hey, thanks for your work on this. We have been working on a new version of the library, `adapters`, which is decoupled from the `transformers` library (see #584 for details). We want to add Electra support to the `adapters` library and have started implementing it, based on this PR, in #583.
Closing in favor of #583.