Better embeddings documentation.
🚀 Feature request
I assume that the feature described at https://docs.adapterhub.ml/embeddings.html allows adding several new tokens to a model and training their embeddings without updating the other embeddings. I would ask the developers to clarify a few points in the documentation:

- If I understand the text correctly, this function adds new embeddings on top of the existing ones. But do the new embeddings replace the initial ones entirely, or only substitute certain tokens? I don't quite understand how exactly it is supposed to work.
- Should I add the new tokens to the tokenizer that is passed as the first argument to `.add_embeddings`?
- How does it behave during a training run? Will the initial part of the embedding matrix be updated, or is it possible to freeze the old embeddings?
- Please provide examples of how to use it; to make the question concrete, my assumed workflow is sketched below.
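A minimal, untested sketch of what I assume the workflow looks like, pieced together from the docs page; the embedding name, the `"default"` reference, and the `set_active_embeddings` call are my guesses, not something I have verified:

```python
from transformers import AutoTokenizer
from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("bert-base-uncased")
ref_tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# A tokenizer extended with the new tokens whose embeddings I want to train.
new_tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
new_tokenizer.add_tokens(["<dom_tok_1>", "<dom_tok_2>"])

# Create a fresh embedding matrix sized for new_tokenizer. My assumption:
# rows for tokens shared with ref_tokenizer are copied from the pretrained
# ("default") embedding, while the added tokens get newly initialized rows.
model.add_embeddings(
    "domain_embeddings",
    new_tokenizer,
    reference_embedding="default",
    reference_tokenizer=ref_tokenizer,
)
model.set_active_embeddings("domain_embeddings")
```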
Motivation
It is largely unclear how to use this function, so barely anyone uses it. I cannot find a single usage example on GitHub, apart from forks of adapters.
Your contribution
None for now.
Okay, I need to ask a question: does any maintainer here understand how to use this feature?
Hey @arsserpentarium, I have created a notebook illustrating the use of the embeddings functionality. While working on it, I found a bug in the training, addressed in #655, so please install from that branch until it is merged.
I hope this clarifies the usage and training of embeddings.
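For readers landing here later, a condensed sketch of the training flow, based on my reading of the docs rather than quoted from the notebook; the adapter name, the placeholder tokens, and the training loop are illustrative only:

```python
from transformers import AutoTokenizer
from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
tokenizer.add_tokens(["<dom_tok_1>", "<dom_tok_2>"])

# Add and activate a new embedding for the extended tokenizer, copying
# overlapping rows from the pretrained ("default") embedding.
model.add_embeddings(
    "domain_embeddings",
    tokenizer,
    reference_embedding="default",
    reference_tokenizer=AutoTokenizer.from_pretrained("bert-base-uncased"),
)
model.set_active_embeddings("domain_embeddings")

# Embedding training is coupled to adapter training: train_adapter freezes
# everything except the given adapter, and train_embeddings=True (as I
# understand it) additionally unfreezes the currently active embedding.
# The original "default" embedding stays frozen.
model.add_adapter("my_task_adapter")
model.train_adapter("my_task_adapter", train_embeddings=True)

# ... run a normal training loop / transformers Trainer here ...

# Embeddings are saved separately from adapter weights.
model.save_embeddings("./domain_embeddings", "domain_embeddings")
```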