
GPTBigCodeForCausalLM support doesn't work


I tried to use vLLM with my model fine-tuned from StarCoder, but the architecture doesn't seem to be supported by the official package, even though the README.md says it is.

```
ValueError                                Traceback (most recent call last)
[<ipython-input-3-e99dbf2ccfe8>](https://localhost:8080/#) in <cell line: 4>()
      2
      3 prompts = ["Helloworld in python"] #You can put several prompts in this list
----> 4 llm = LLM(model="Safurai/Safurai-001")  # Load the model
      5 outputs = llm.generate(prompts)  # Trigger inference

[/usr/local/lib/python3.10/dist-packages/vllm/model_executor/model_loader.py](https://localhost:8080/#) in _get_model_architecture(config)
     25         if arch in _MODEL_REGISTRY:
     26             return _MODEL_REGISTRY[arch]
---> 27     raise ValueError(
     28         f"Model architectures {architectures} are not supported for now. "
     29         f"Supported architectures: {list(_MODEL_REGISTRY.keys())}"

ValueError: Model architectures ['GPTBigCodeForCausalLM'] are not supported for now. Supported architectures: ['GPT2LMHeadModel', 'GPTNeoXForCausalLM', 'LlamaForCausalLM', 'OPTForCausalLM']
```
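The error comes from vLLM matching the `architectures` field of the model's Hugging Face `config.json` against its internal registry. A minimal sketch of that pre-flight check, using a hypothetical `SUPPORTED` set frozen to the list printed in the traceback above (the real registry lives in `vllm/model_executor/model_loader.py` and grows with each release):

```python
import json

# Hypothetical snapshot of the supported set at the time of this issue,
# copied from the ValueError message above.
SUPPORTED = {
    "GPT2LMHeadModel",
    "GPTNeoXForCausalLM",
    "LlamaForCausalLM",
    "OPTForCausalLM",
}

def is_supported(config_json: str) -> bool:
    """Return True if any architecture in a HF config.json is in the registry."""
    archs = json.loads(config_json).get("architectures", [])
    return any(arch in SUPPORTED for arch in archs)

# The fine-tuned StarCoder model declares GPTBigCodeForCausalLM,
# which is missing from this older registry:
print(is_supported('{"architectures": ["GPTBigCodeForCausalLM"]}'))  # False
```

Checking the `architectures` field this way explains why a model card can say "StarCoder-based" while vLLM still rejects it: support is keyed on the exact architecture class name, not the model family.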

davide221 avatar Jun 30 '23 15:06 davide221

Can you try building from source with the latest master commit?
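A sketch of the from-source install being suggested, assuming the standard vLLM repository URL; a source build picks up architectures (like GPTBigCodeForCausalLM) merged after the last PyPI release:

```shell
# Build vLLM from the latest master instead of the PyPI release.
git clone https://github.com/vllm-project/vllm.git
cd vllm
pip install -e .   # compiles the CUDA kernels; requires a CUDA toolchain
```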

zhuohan123 avatar Jun 30 '23 19:06 zhuohan123

Closing this issue due to inactivity. Feel free to re-open if there is any update.

zhuohan123 avatar Jul 07 '23 05:07 zhuohan123

Yes, building from source works, thanks!

davide221 avatar Jul 18 '23 10:07 davide221