
LlamaTokenizer-related issue

Open xiaojidaner opened this issue 1 year ago • 5 comments

Hello, I ran into this problem with the 13B model: ValueError: Tokenizer class LlamaTokenizer does not exist or is not currently imported.

xiaojidaner avatar Jun 09 '23 09:06 xiaojidaner

Your transformers version is probably wrong; try upgrading to the latest release.

suolyer avatar Jun 11 '23 14:06 suolyer
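The upgrade advice above can be checked programmatically before the error is hit. A minimal sketch, assuming the thread's claim that LLaMA support (including LlamaTokenizer) landed in transformers 4.28; the helper name has_llama_tokenizer is mine, not from the thread:

```python
def has_llama_tokenizer(version: str) -> bool:
    """Return True if this transformers version ships LlamaTokenizer.

    Per the discussion below, LLaMA classes were added in transformers 4.28;
    older versions raise "Tokenizer class LlamaTokenizer does not exist".
    """
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) >= (4, 28)

print(has_llama_tokenizer("4.27.4"))  # False: too old, triggers the ValueError
print(has_llama_tokenizer("4.30.2"))  # True
```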

Which transformers version are you on that can import LlamaTokenizer? In 4.30.2, what gets imported is transformers.models.llama.LlamaTokenizerFast; there is no LlamaTokenizer. @suolyer @xiaojidaner

Windy-Ground avatar Jul 04 '23 11:07 Windy-Ground

> Which transformers version are you on that can import LlamaTokenizer? In 4.30.2, what gets imported is transformers.models.llama.LlamaTokenizerFast; there is no LlamaTokenizer. @suolyer @xiaojidaner

I upgraded to transformers 4.28+ and changed from transformers.models.llama import LlamaForCausalLM, LlamaTokenizer, LlamaConfig to from transformers import LlamaForCausalLM, LlamaTokenizer, LlamaConfig. But now I get ModuleNotFoundError: No module named 'flash_attn_cuda' and don't know what the problem is. @Windy-Ground

karlshoo avatar Jul 17 '23 08:07 karlshoo
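The import fix described above can be made defensive, so that an older transformers fails with a clear upgrade hint instead of an ImportError mid-run. A sketch under that assumption (the flag name HAVE_LLAMA is mine):

```python
try:
    # Top-level import, as suggested in the comment above:
    # these classes are exported from transformers >= 4.28.
    from transformers import LlamaForCausalLM, LlamaTokenizer, LlamaConfig
    HAVE_LLAMA = True
except ImportError:
    # Either transformers is missing or it predates LLaMA support.
    HAVE_LLAMA = False
    print("Llama classes unavailable; upgrade with: pip install -U 'transformers>=4.28'")
```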

> Which transformers version are you on that can import LlamaTokenizer? In 4.30.2, what gets imported is transformers.models.llama.LlamaTokenizerFast; there is no LlamaTokenizer. @suolyer @xiaojidaner

4.28 should work.

Desein-Yang avatar Sep 04 '23 07:09 Desein-Yang

> Which transformers version are you on that can import LlamaTokenizer? In 4.30.2, what gets imported is transformers.models.llama.LlamaTokenizerFast; there is no LlamaTokenizer. @suolyer @xiaojidaner

> I upgraded to transformers 4.28+ and changed from transformers.models.llama import LlamaForCausalLM, LlamaTokenizer, LlamaConfig to from transformers import LlamaForCausalLM, LlamaTokenizer, LlamaConfig. But now I get ModuleNotFoundError: No module named 'flash_attn_cuda' and don't know what the problem is. @Windy-Ground

flash_attn_cuda is pulled in to speed things up with flash attention; it is not required if you don't enable flash attention. See https://github.com/Dao-AILab/flash-attention

Desein-Yang avatar Sep 04 '23 08:09 Desein-Yang
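Since flash attention is optional per the answer above, the missing-module error can be turned into a feature flag instead of a crash. A hypothetical sketch (the flag name USE_FLASH_ATTN is mine, not from the repository):

```python
try:
    # Compiled CUDA extension from Dao-AILab/flash-attention.
    import flash_attn_cuda
    USE_FLASH_ATTN = True
except ImportError:
    # Fall back to standard attention when the extension is not installed;
    # per the thread, it is only an optional speedup.
    USE_FLASH_ATTN = False

print("flash attention enabled:", USE_FLASH_ATTN)
```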