Nguyen Nguyen Anh
I am running 8-bit and hit the same error. A monkey-patch is not required for 8-bit, am I right? Please advise, thank you.

Steve
Exactly why we have to pretrain and finetune again!
You would be better off training in the standard Alpaca format from the LLaMA-3 pretrained weights, with the new LLaMA-3 bos/eos tokens, and it should work.
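For context, here is a minimal sketch of what that could look like with the Hugging Face `transformers` tokenizer. The model ID and the choice to wrap each Alpaca-formatted sample in the tokenizer's own bos/eos are my assumptions, not a recipe confirmed in this thread:

```python
# Sketch: Alpaca-format a sample and bracket it with LLaMA-3's bos/eos tokens.
# Assumes the public (gated) LLaMA-3 base checkpoint; verify against your own weights.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B")

ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n{output}"
)

def format_example(example: dict) -> str:
    # Use the tokenizer's own bos/eos so fine-tuning sees the same
    # sequence boundaries the pretrained model was trained with.
    body = ALPACA_TEMPLATE.format(**example)
    return tokenizer.bos_token + body + tokenizer.eos_token

print(format_example({"instruction": "Say hi.", "output": "Hi!"}))
```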
I am waiting for the #1513 breaking changes to land before I start continual pretraining of LLaMA-3 with an extended vocab and all. Not sure when this merge will happen (v0.19.2 I guess) as...
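For anyone following along, a hedged sketch of the vocab-extension step being discussed, again using `transformers`; the placeholder token strings are hypothetical stand-ins for the non-English pieces you would actually add:

```python
# Sketch: extend the tokenizer vocab and grow the embedding matrices
# before continual pretraining. Token strings below are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B"  # assumption: base, not instruct, weights
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

new_tokens = ["<new_piece_1>", "<new_piece_2>"]  # hypothetical added pieces
num_added = tokenizer.add_tokens(new_tokens)

# Resize input/output embeddings to cover the new IDs; the new rows are
# randomly initialized, which is why continual pretraining is needed.
model.resize_token_embeddings(len(tokenizer))
print(f"added {num_added} tokens, new vocab size {len(tokenizer)}")
```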
> @thusinh1969 What are you finding wrong with 0.19.1?

The decoder was buggy for added tokens when we want to extend the vocab for non-English. I think it has been fixed: https://github.com/meta-llama/llama3/issues/67

Steve
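If it helps others verify, a quick round-trip check (not the fix itself) for this class of decoder bug: encode a string containing an added token and see whether decoding reproduces it. The token string here is a placeholder, and the failure mode in the comment is only what I understood from the thread:

```python
# Sketch: round-trip an added token through encode/decode to check
# whether the decoder handles extended-vocab tokens correctly.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B")
tokenizer.add_tokens(["<my_new_token>"])  # hypothetical added token

ids = tokenizer.encode("xin chao <my_new_token>", add_special_tokens=False)
decoded = tokenizer.decode(ids)
print(ids, "->", repr(decoded))
# If the decoder is buggy, the added token does not survive the round trip.
```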
Gents, when can this fix be ready, please?

Thanks, Steve