Adding adapter support for NeoX
Added adapter support for GPTNeoX, with tests. However, at the moment, training the adapter module also trains the CLM head, which is not the expected behavior. I already raised this here.
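A minimal sketch of how the problem shows up (assuming the adapter-transformers flexible-heads API; the adapter model class follows this PR's naming and the checkpoint name is only illustrative):

```python
from transformers import GPTNeoXAdapterModel  # class added in this PR

model = GPTNeoXAdapterModel.from_pretrained("EleutherAI/pythia-70m")
model.add_adapter("neox_adapter")
model.add_causal_lm_head("neox_clm")
model.train_adapter("neox_adapter")  # should freeze everything except the adapter weights

# List parameters that still receive gradients; ideally only adapter
# weights appear here, but the CLM head parameters also show up.
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print([n for n in trainable if "adapters" not in n])
```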
@calpt, sorry to bother you. Did you by chance get a look at this?
Hey, thanks again for your efforts in contributing new model architectures to adapter-transformers and sorry for the silence on our side.
In the last few weeks, we've been working on a large refactoring of our project, which will ultimately result in the release of Adapters, the next-generation adapters library. See https://github.com/adapter-hub/adapter-transformers/issues/584.
As a consequence, we plan to merge any new model integrations directly into the new codebase, which can currently be found on this branch. Unfortunately, this necessitates some changes in the model integration code (detailed here; see already-integrated models such as BERT and BART for reference).
If you'd be willing to update your model integration to target the new library yourself, we'd be super happy to help you with this. Otherwise, we might look into upgrading and merging some of the open model integration PRs ourselves in the future. For more details, again see https://github.com/adapter-hub/adapter-transformers/issues/584.
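For reference, once a model is ported, the new library exposes adapter support as a plug-in on top of plain Hugging Face models rather than via a forked `transformers`. A rough sketch of the user-facing pattern (the checkpoint and adapter names are only illustrative):

```python
import adapters
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("EleutherAI/pythia-70m")
adapters.init(model)                 # attach adapter methods to the plain HF model
model.add_adapter("neox_adapter")
model.train_adapter("neox_adapter")  # freeze the base model, train only the adapter
```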