
embed_tokens

Open CodeMiningCZW opened this issue 10 months ago • 4 comments

In the RetNet model, embed_tokens is not given, so I can't run the code. When I use this model, what should I pass for the token_embeddings parameter? Or how do I define embed_tokens?

CodeMiningCZW avatar Aug 16 '23 01:08 CodeMiningCZW

I found one blog (in Japanese) that might be useful https://zenn.dev/selllous/articles/retnet_tutorial.

donglixp avatar Aug 16 '23 03:08 donglixp

A simple nn.Embedding(vocab_size, embedding_size) will work. Or you can refer to our example on language modeling.
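To make this concrete, here is a minimal sketch of defining such an embedding and what it produces. The sizes are hypothetical placeholders, and the commented RetNetDecoder call assumes torchscale's decoder accepts an embed_tokens argument as described in this thread:

```python
import torch
import torch.nn as nn

vocab_size, embed_dim = 1000, 64  # hypothetical sizes; match your model config
embed_tokens = nn.Embedding(vocab_size, embed_dim)

# embed_tokens maps token ids of shape (batch, seq_len)
# to embeddings of shape (batch, seq_len, embed_dim)
tokens = torch.randint(0, vocab_size, (2, 16))
embeddings = embed_tokens(tokens)
print(embeddings.shape)  # torch.Size([2, 16, 64])

# Then pass it to the decoder, e.g.:
# model = RetNetDecoder(config, embed_tokens=embed_tokens)
```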

shumingma avatar Aug 16 '23 03:08 shumingma

I also encountered this problem. When I try to use the encoder and decoder modules separately, the code reports an error. I would also like to know where the problem is and how to solve it.

egoistor avatar Sep 01 '23 08:09 egoistor

A simple nn.Embedding(vocab_size, embedding_size) will work. Or you can refer to our example on language modeling.

from fairseq.models.transformer import DEFAULT_MIN_PARAMS_TO_WRAP, Embedding

I can't find the transformer module, so this import fails.
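If fairseq (or that import path) is unavailable, fairseq's Embedding helper can be replaced with a plain nn.Embedding. The sketch below mirrors what fairseq's helper does, to my understanding: standard embedding with normal(0, dim**-0.5) weight init and a zeroed padding row. Treat the exact init as an assumption and check against your fairseq version:

```python
import torch
import torch.nn as nn

def Embedding(num_embeddings, embedding_dim, padding_idx=None):
    # Plain nn.Embedding with fairseq-style initialization (assumed):
    # weights ~ normal(0, embedding_dim ** -0.5), padding row zeroed.
    m = nn.Embedding(num_embeddings, embedding_dim, padding_idx=padding_idx)
    nn.init.normal_(m.weight, mean=0, std=embedding_dim ** -0.5)
    if padding_idx is not None:
        nn.init.constant_(m.weight[padding_idx], 0)
    return m

# Usage: drop-in replacement for the fairseq import above.
embed_tokens = Embedding(1000, 64, padding_idx=1)
```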

DaZhUUU avatar Oct 31 '23 12:10 DaZhUUU