### System Info

conda, python=3.11
### Who can help?

No response
### Information

- [X] The official example scripts
- [ ] My own modified scripts
### Reproduction
```
INFO:sat:[RANK 0] > initializing model parallel with size 2
[Language processor version]: chat
[rank1]: Traceback (most recent call last):
[rank1]:   File "/home/tt/CogVLM/basic_demo/cli_demo_sat.py", line 161, in <module>
[rank1]:     main()
[rank1]:   File "/home/tt/CogVLM/basic_demo/cli_demo_sat.py", line 56, in main
[rank1]:     tokenizer = llama2_tokenizer(args.local_tokenizer, signal_type=language_processor_version)
[rank1]:                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/home/tt/CogVLM/utils/utils/language.py", line 37, in llama2_tokenizer
[rank1]:     tokenizer = LlamaTokenizer.from_pretrained(tokenizer_path)
[rank1]:                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/home/tt/anaconda3/envs/coga/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 2163, in from_pretrained
[rank1]:     return cls._from_pretrained(
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/home/tt/anaconda3/envs/coga/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 2397, in _from_pretrained
[rank1]:     tokenizer = cls(*init_inputs, **init_kwargs)
[rank1]:                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/home/tt/anaconda3/envs/coga/lib/python3.11/site-packages/transformers/models/llama/tokenization_llama.py", line 171, in __init__
[rank1]:     self.sp_model = self.get_spm_processor(kwargs.pop("from_slow", False))
[rank1]:                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/home/tt/anaconda3/envs/coga/lib/python3.11/site-packages/transformers/models/llama/tokenization_llama.py", line 204, in get_spm_processor
[rank1]:     model = model_pb2.ModelProto.FromString(sp_model)
[rank1]:             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: google.protobuf.message.DecodeError: Error parsing message
```
### Expected behavior
I downloaded vicuna-7b-v1.5 from ModelScope.
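A `google.protobuf.message.DecodeError: Error parsing message` at this point usually means the `tokenizer.model` file on disk is not a valid SentencePiece protobuf, most often because the download left a Git LFS pointer file or an HTML error page in its place. A minimal sanity check, as a sketch (the helper name `check_tokenizer_model` and the 100 KB size threshold are assumptions, not part of CogVLM):

```python
import os

def check_tokenizer_model(path: str) -> str:
    """Heuristic check for a broken tokenizer.model download.

    Returns a short diagnosis string; a healthy SentencePiece model is a
    binary protobuf several hundred KB in size.
    """
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        head = f.read(64)
    if head.startswith(b"version https://git-lfs"):
        # The real file was never fetched; only the LFS pointer text exists.
        return "git-lfs pointer file - run `git lfs pull` in the model directory"
    if head.lstrip().startswith(b"<"):
        # An HTML error page was saved instead of the binary model.
        return "looks like HTML - the download likely failed, re-download the file"
    if size < 100_000:
        # Threshold is a rough assumption; real LLaMA tokenizers are far larger.
        return f"suspiciously small ({size} bytes) - file is likely truncated"
    return "header looks plausible"
```

If the check reports a pointer file or truncation, re-downloading `tokenizer.model` (or running `git lfs pull`) before retrying `cli_demo_sat.py` should resolve the `DecodeError`.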