llama3-from-scratch
How to generate <|end_of_text|>?
I added a for loop to this program that appends the next token to the end of the prompt, but it never generates <|end_of_text|>; it just repeats the same context over and over. For example, with 2x8= as the original prompt, it generates a sequence like this: ['<|begin_of_text|>', '2', 'x', '9', '=', '18', '\n', '2', 'x', '9', '=', '18', '\n', '2', 'x']
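
The loop I mean is roughly like this (a minimal sketch, not my exact code: `forward(tokens)` stands in for the notebook's forward pass that returns per-position logits, and the special-token ids 128000 / 128001 are the usual Llama 3 values for <|begin_of_text|> / <|end_of_text|>):

```python
import torch

def generate(forward, tokenizer, prompt, max_new_tokens=64,
             bos_id=128000, eot_id=128001):
    # Encode the prompt and prepend <|begin_of_text|>.
    tokens = [bos_id] + tokenizer.encode(prompt)
    for _ in range(max_new_tokens):
        # Hypothetical forward pass: returns logits of shape [seq_len, vocab_size].
        logits = forward(torch.tensor(tokens))
        # Greedy pick from the last position.
        next_token = int(torch.argmax(logits[-1], dim=-1))
        # Append the new token so it is part of the prompt on the next iteration.
        tokens.append(next_token)
        # Stop if the model ever emits <|end_of_text|> (which it never does for me).
        if next_token == eot_id:
            break
    return tokenizer.decode(tokens)
```

With this loop the model just keeps extending the pattern instead of ever emitting the end-of-text token.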