
What is the context size / context window of LLaMA?


What is the maximum token limit of llama? Is it 1024, 2048, 4096, or longer?

How many tokens can it handle during inference?

I did find similar issues but no one has really answered the question, so I would appreciate any help I can get.

sauravtii · Apr 14 '23

https://github.com/facebookresearch/llama/commit/a81fb4e211d6656854a5dd24cf8631dc319234d1

2048

AlyoshaVasilieva · Apr 14 '23

Thanks @AlyoshaVasilieva, is it the same for all models (7B, 13B, 33B, 65B)?

sauravtii · Apr 14 '23

@sauravtii - did you find the answer to this question? Also, how does one find the context window of the various new open-source models?

dhirajsuvarna · Jun 21 '23
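One general way to check is to read the model's config: for checkpoints published on the Hugging Face Hub, the context window is usually recorded as `max_position_embeddings`. A minimal sketch, assuming the `transformers` library is installed; the repo IDs below are only examples, and `meta-llama/Llama-2-7b-hf` is gated (requires access approval):

```python
from transformers import AutoConfig

for repo_id in [
    "huggyllama/llama-7b",        # example mirror of the original LLaMA weights
    "meta-llama/Llama-2-7b-hf",   # Llama 2 (gated repo)
]:
    config = AutoConfig.from_pretrained(repo_id)
    # Not every architecture uses this field name, so fall back gracefully.
    ctx = getattr(config, "max_position_embeddings", None)
    print(f"{repo_id}: context window = {ctx}")
```

If a model's config doesn't expose this field, the model card or paper is usually the next place to look.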

Same for Llama 2? (I can't seem to find an answer)

freckletonj · Jul 31 '23

> Same for Llama 2? (I can't seem to find an answer)

LLaMA: 2048 tokens
Llama 2: 4096 tokens

gaba42 · Aug 01 '23
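When running the models with the reference code in this repo, the usable context is also capped by the `max_seq_len` you pass at load time: prompt tokens plus generated tokens must fit inside it, and it cannot usefully exceed the trained limit (2048 for LLaMA, 4096 for Llama 2). A rough sketch, assuming the Llama 2 example interface (`Llama.build` / `text_completion`) described in this repo's README; the checkpoint and tokenizer paths are placeholders:

```python
from llama import Llama

# Placeholder paths; run via torchrun as in the repo's example scripts.
generator = Llama.build(
    ckpt_dir="llama-2-7b/",            # placeholder: downloaded weights directory
    tokenizer_path="tokenizer.model",  # placeholder: tokenizer file
    max_seq_len=4096,   # Llama 2 trained context; use 2048 for the original LLaMA
    max_batch_size=1,
)

results = generator.text_completion(
    ["The context window of Llama 2 is"],
    max_gen_len=64,     # generated tokens also count toward max_seq_len
    temperature=0.6,
    top_p=0.9,
)
print(results[0]["generation"])
```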

Thanks, all, for helping to answer here, and I hope your questions were answered, @sauravtii. You can also find this information in the Llama 2 model card - https://github.com/facebookresearch/llama/blob/main/MODEL_CARD.md - and on the website https://ai.meta.com/llama/

ejsd1989 · Sep 06 '23