Med-ChatGLM
Runtime error
home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/transformers/generation/utils.py:1201: UserWarning: You have modified the pretrained model configuration to control generation. This is a deprecated strategy to control generation and will be removed soon, in a future version. Please use a generation configuration file (see https://huggingface.co/docs/transformers/main_classes/text_generation)
  warnings.warn(
Traceback (most recent call last):
  File "/media/remotesense/c076bdaf-88b9-4573-88f1-b4bdb3af3183/jack/chatglm-med/Med-ChatGLM-main/infer.py", line 12, in <module>
    response, history = model.chat(tokenizer, "问题:" + a.strip() + '\n答案:', max_length=256, history=[])
  File "/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/media/remotesense/c076bdaf-88b9-4573-88f1-b4bdb3af3183/jack/chatglm-med/Med-ChatGLM-main/modeling_chatglm.py", line 1114, in chat
    outputs = self.generate(**input_ids, **gen_kwargs)
  File "/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/transformers/generation/utils.py", line 1452, in generate
    return self.sample(
  File "/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/transformers/generation/utils.py", line 2465, in sample
    model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
  File "/media/remotesense/c076bdaf-88b9-4573-88f1-b4bdb3af3183/jack/chatglm-med/Med-ChatGLM-main/modeling_chatglm.py", line 979, in prepare_inputs_for_generation
    mask_position = seq.index(mask_token)
ValueError: 130001 is not in list
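For context, the crash happens inside `prepare_inputs_for_generation` in `modeling_chatglm.py`, which calls `seq.index(mask_token)` on the list of input token ids. If the tokenizer never emitted the mask id 130001 (typically due to a dependency/version mismatch), `list.index` raises exactly this `ValueError`. A minimal sketch of that lookup, with placeholder token ids (only 130001 is a real value taken from the error message):

```python
# Sketch of the failing lookup in prepare_inputs_for_generation.
# 130001 is the mask token id the model searches for; the other ids
# below are placeholders for illustration, not real vocabulary entries.
MASK_TOKEN_ID = 130001

def find_mask_position(seq, mask_token=MASK_TOKEN_ID):
    """The original code does `seq.index(mask_token)`, which raises
    ValueError when the id is absent; this wrapper surfaces that case."""
    try:
        return seq.index(mask_token)
    except ValueError:
        return -1  # tokenizer never produced the mask token

print(find_mask_position([5, 64286, 130001, 3]))  # 2: mask found, generation proceeds
print(find_mask_position([5, 64286, 12, 3]))      # -1: the situation that triggers the crash
```

In other words, the error is not about the input text itself; it means the ids produced by the tokenizer and the ids the model code expects come from incompatible versions.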
I'm hitting the same issue. Did you solve it? Platform: MacBook Pro (M1 Pro).
I ran into the same problem.
Same problem here. Has anyone solved it?
(same traceback as the original post, ending in `ValueError: 130001 is not in list`)
See the FAQ section in the project's README.md; it covers this error.
Q: Error `ValueError: 130001 is not in list` / `ValueError: 150001 is not in list`
A: The relevant dependencies update quickly, and version differences can cause bugs.
(1) If the error is `150001 is not in list`, update the repository to the latest version.
(2) If the error is `130001 is not in list`, roll the repository back to commit cb9d827: https://github.com/SCIR-HI/Med-ChatGLM/tree/cb9d82738021ec6f82b307d6031e8595a49dcb00
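The rollback the FAQ describes can be done with git. The commit hash below is the one given in the FAQ answer; the clone URL is assumed to be the repo's standard GitHub address:

```shell
# Clone the repo and pin it to the commit the FAQ recommends
# for the "130001 is not in list" error.
git clone https://github.com/SCIR-HI/Med-ChatGLM.git
cd Med-ChatGLM
git checkout cb9d82738021ec6f82b307d6031e8595a49dcb00
git log -1 --oneline   # should show commit cb9d827
```

Note that a detached-HEAD warning from `git checkout` is expected here and harmless for running inference.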
I rolled back to that commit and still get the same error: `ValueError: 130001 is not in list`.