
ChatGLM3-6B AttributeError: 'ChatGLMConfig' object has no attribute 'num_hidden_layers'

Open · Senna1960321 opened this issue 2 years ago · 4 comments

Traceback (most recent call last):
  File "vcheck.py", line 2014, in <module>
    llm = LLM(model=model_path, trust_remote_code=True)  # Name or path of your model
  File "/app/vllm/vllm/entrypoints/llm.py", line 66, in __init__
    self.llm_engine = LLMEngine.from_engine_args(engine_args)
  File "/app/vllm/vllm/engine/llm_engine.py", line 157, in from_engine_args
    engine = cls(*engine_configs,
  File "/app/vllm/vllm/engine/llm_engine.py", line 79, in __init__
    self._verify_args()
  File "/app/vllm/vllm/engine/llm_engine.py", line 114, in _verify_args
    self.model_config.verify_with_parallel_config(self.parallel_config)
  File "/app/vllm/vllm/config.py", line 81, in verify_with_parallel_config
    total_num_hidden_layers = self.hf_config.num_hidden_layers
  File "/usr/local/lib/python3.8/dist-packages/transformers/configuration_utils.py", line 265, in __getattribute__
    return super().__getattribute__(key)
AttributeError: 'ChatGLMConfig' object has no attribute 'num_hidden_layers'

My vLLM version is 0.2.6.

Senna1960321 avatar Dec 27 '23 09:12 Senna1960321
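The traceback shows vLLM 0.2.6 reading self.hf_config.num_hidden_layers inside verify_with_parallel_config, while the ChatGLM config loaded via trust_remote_code appears to expose its layer count under a different name (num_layers), which would explain the AttributeError. A minimal diagnostic sketch, assuming a local or Hub copy of ChatGLM3-6B (the path below is a placeholder), to see which attributes the loaded config actually provides:

```python
from transformers import AutoConfig

# Placeholder: point this at your local ChatGLM3-6B checkout or the hub ID.
model_path = "THUDM/chatglm3-6b"

cfg = AutoConfig.from_pretrained(model_path, trust_remote_code=True)
print(type(cfg).__name__)                  # ChatGLMConfig (from the remote code)
print(cfg.model_type)                      # expected: "chatglm"
print(hasattr(cfg, "num_hidden_layers"))   # the attribute vLLM 0.2.6 looks up
print(hasattr(cfg, "num_layers"))          # the name the ChatGLM config uses instead
```

If the config exposes num_layers but not num_hidden_layers, the failure is an attribute-name mismatch rather than a corrupted download.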

Hi @Senna1960321, could you check the model_type attribute of your model's config.json? It should be "chatglm".

WoosukKwon avatar Jan 03 '24 06:01 WoosukKwon
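For anyone checking the same field, a quick way to read it straight from config.json (the path is a placeholder for your local model directory):

```python
import json

# Placeholder path; substitute the directory holding your ChatGLM3-6B files.
with open("/path/to/chatglm3-6b/config.json") as f:
    config = json.load(f)

print(config.get("model_type"))  # should print "chatglm"
```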

I checked config.json; its value is "model_type": "chatglm". @WoosukKwon

Senna1960321 avatar Jan 04 '24 10:01 Senna1960321

Hi @Senna1960321, I met the same error with version 0.2.6. Have you solved the issue?

BaileyWei avatar Jan 08 '24 06:01 BaileyWei

No, I still can't use ChatGLM3-6B, but Llama2-7b works for me. @BaileyWei

Senna1960321 avatar Jan 08 '24 07:01 Senna1960321
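For readers hitting the same error on vLLM 0.2.6 who cannot upgrade, one possible stopgap is to give the config the attribute name vLLM expects before building the engine. This is a hedged sketch rather than a verified fix: it only papers over this one attribute lookup, other parts of the stack may still trip on ChatGLM specifics, and the model path is a placeholder.

```python
from transformers import AutoConfig
from vllm import LLM

# Placeholder: a local directory containing the ChatGLM3-6B weights and config.
model_path = "/path/to/chatglm3-6b"

cfg = AutoConfig.from_pretrained(model_path, trust_remote_code=True)

# The ChatGLM remote-code config stores its layer count as `num_layers`,
# while vLLM 0.2.6 reads `num_hidden_layers`; alias one to the other and
# write the updated config.json back into the local model directory.
if not hasattr(cfg, "num_hidden_layers") and hasattr(cfg, "num_layers"):
    cfg.num_hidden_layers = cfg.num_layers
    cfg.save_pretrained(model_path)

llm = LLM(model=model_path, trust_remote_code=True)
```

Editing config.json by hand to add a "num_hidden_layers" entry with the same value as "num_layers" should have the same effect; if a later vLLM release maps the attribute itself, upgrading is the cleaner route.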