Traceback (most recent call last):
  File "vcheck.py", line 2014, in <module>
    llm = LLM(model=model_path, trust_remote_code=True)  # Name or path of your model
  File "/app/vllm/vllm/entrypoints/llm.py", line 66, in __init__
    self.llm_engine = LLMEngine.from_engine_args(engine_args)
  File "/app/vllm/vllm/engine/llm_engine.py", line 157, in from_engine_args
    engine = cls(*engine_configs,
  File "/app/vllm/vllm/engine/llm_engine.py", line 79, in __init__
    self._verify_args()
  File "/app/vllm/vllm/engine/llm_engine.py", line 114, in _verify_args
    self.model_config.verify_with_parallel_config(self.parallel_config)
  File "/app/vllm/vllm/config.py", line 81, in verify_with_parallel_config
    total_num_hidden_layers = self.hf_config.num_hidden_layers
  File "/usr/local/lib/python3.8/dist-packages/transformers/configuration_utils.py", line 265, in __getattribute__
    return super().__getattribute__(key)
AttributeError: 'ChatGLMConfig' object has no attribute 'num_hidden_layers'
My vLLM version is 0.2.6.
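ChatGLM-family configs typically name the layer count num_layers rather than the num_hidden_layers attribute that the vLLM code in the traceback reads, which would explain the AttributeError. A minimal diagnostic sketch to confirm what the config actually exposes (the model path below is a placeholder):

```python
# Diagnostic sketch; "/path/to/chatglm3-6b" is a placeholder path.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("/path/to/chatglm3-6b", trust_remote_code=True)
print(cfg.model_type)                                # expected: "chatglm"
print(getattr(cfg, "num_hidden_layers", "missing"))  # the name vLLM reads here
print(getattr(cfg, "num_layers", "missing"))         # the name ChatGLM configs usually use
```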
Hi @Senna1960321, could you check the model_type attribute of your model's config.json? It should be "chatglm".
I checked the config.json; its value is "model_type": "chatglm". @WoosukKwon
Hi, @Senna1960321 I met the same error with version 0.2.6, have you solved the issue?
No, I can't use ChatGLM3-6B yet, but I can use Llama2-7b. @BaileyWei
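Until this is addressed in vLLM itself, one possible workaround is to mirror the layer count under the attribute name vLLM expects and write it back into config.json. This is an untested sketch, not a confirmed fix: it assumes the config exposes num_layers, and the model path is a placeholder.

```python
# Hypothetical workaround sketch, not a confirmed fix: alias ChatGLM's
# num_layers to the num_hidden_layers name that vLLM 0.2.6 reads.
from transformers import AutoConfig
from vllm import LLM

model_path = "/path/to/chatglm3-6b"  # placeholder local model directory
cfg = AutoConfig.from_pretrained(model_path, trust_remote_code=True)
if not hasattr(cfg, "num_hidden_layers") and hasattr(cfg, "num_layers"):
    cfg.num_hidden_layers = cfg.num_layers  # mirror the expected attribute
    cfg.save_pretrained(model_path)         # overwrites config.json with the alias added

llm = LLM(model=model_path, trust_remote_code=True)
```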