gpt_academic
A question: when deploying ChatGLM locally, I downloaded the weight folder and placed it inside the project folder, then changed the path in bridge_chatglm.py, but the weights still fail to load. The error is as follows:
Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a revision is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
(the two warnings above repeat several more times)
Process GetGLMHandle-1:
Traceback (most recent call last):
File "e:\tensorflow-learning\github\chatgpt_academic-master\request_llm\bridge_chatglm.py", line 45, in run
self.chatglm_model = AutoModel.from_pretrained(model_path, trust_remote_code=True).float()
File "E:\python-practice\python_envs\Python38_novelAI\lib\site-packages\transformers\models\auto\auto_factory.py", line 440, in from_pretrained
model_class = get_class_from_dynamic_module(
File "E:\python-practice\python_envs\Python38_novelAI\lib\site-packages\transformers\dynamic_module_utils.py", line 384, in get_class_from_dynamic_module
return get_class_in_module(class_name, final_module.replace(".py", ""))
File "E:\python-practice\python_envs\Python38_novelAI\lib\site-packages\transformers\dynamic_module_utils.py", line 154, in get_class_in_module
module = importlib.import_module(module_path)
File "E:\python-practice\python_envs\Python38_novelAI\lib\importlib_init_.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "
from transformers.generation.logits_process import LogitsProcessor
ModuleNotFoundError: No module named 'transformers.generation'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "E:\python-practice\python_envs\Python38_novelAI\lib\multiprocessing\process.py", line 315, in _bootstrap
self.run()
File "e:\tensorflow-learning\github\chatgpt_academic-master\request_llm\bridge_chatglm.py", line 56, in run
raise RuntimeError("不能正常加载ChatGLM的参数!")
RuntimeError: 不能正常加载ChatGLM的参数!
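For context, the call that fails at bridge_chatglm.py line 45 is the standard trust_remote_code loading pattern. Below is a minimal standalone sketch of that same load, assuming the downloaded weights sit in a local folder such as ./chatglm-6b (hypothetical path) and that the installed transformers version is recent enough to ship the transformers.generation package, which is the module the traceback reports as missing:

```python
# Minimal sketch of the local ChatGLM load that bridge_chatglm.py attempts.
# Assumptions: the weight folder is ./chatglm-6b (hypothetical path), and
# transformers provides the transformers.generation package that the model's
# custom modeling code imports.
from transformers import AutoModel, AutoTokenizer

model_path = "./chatglm-6b"  # adjust to wherever the weights actually live

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
# .float() keeps the model in float32 for CPU inference; on a GPU one would
# typically use .half().cuda() instead.
model = AutoModel.from_pretrained(model_path, trust_remote_code=True).float()
model = model.eval()

# Quick smoke test that the weights actually loaded.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```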