llama
Embedding shape / Vocab size
Hello to all, and thank you for this work.
I assume anyone who has access to the model weights, as well as the authors, could answer my question. I may have missed it in the paper, but I don't see any mention of the embedding shape or the tokenizer vocabulary size.
I have the same question. I haven't accessed the weights yet, so I'm asking if anyone has.
The vocab size is 32000.
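For anyone who does have the weights, a quick way to confirm both numbers is to load the SentencePiece tokenizer and inspect the embedding tensor in the checkpoint. A minimal sketch, assuming the original release layout (a `tokenizer.model` file and a `consolidated.00.pth` checkpoint; adjust the paths to wherever your copy lives):

```python
import torch
from sentencepiece import SentencePieceProcessor

# Tokenizer vocabulary size, read straight from the SentencePiece model.
sp = SentencePieceProcessor(model_file="tokenizer.model")
print(sp.vocab_size())  # expected: 32000

# Embedding shape, read from the checkpoint on CPU (no GPU needed).
ckpt = torch.load("consolidated.00.pth", map_location="cpu")
print(ckpt["tok_embeddings.weight"].shape)  # (vocab_size, hidden_dim), e.g. [32000, 4096] for 7B
```

The first dimension of `tok_embeddings.weight` should match the tokenizer's vocab size; the second is the model's hidden dimension, which varies by model size.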