GPTBigCodeForCausalLM support doesn't work
I tried to use vLLM on my model fine-tuned from StarCoder, but it does not seem to be supported by the official package (?). The README.md says it is supported.
```
ValueError                                Traceback (most recent call last)
<ipython-input-3-e99dbf2ccfe8> in <cell line: 4>()
      2
      3 prompts = ["Helloworld in python"] # You can put several prompts in this list
----> 4 llm = LLM(model="Safurai/Safurai-001") # Load the model
      5 outputs = llm.generate(prompts) # Trigger inference

5 frames
/usr/local/lib/python3.10/dist-packages/vllm/model_executor/model_loader.py in _get_model_architecture(config)
     25     if arch in _MODEL_REGISTRY:
     26         return _MODEL_REGISTRY[arch]
---> 27     raise ValueError(
     28         f"Model architectures {architectures} are not supported for now. "
     29         f"Supported architectures: {list(_MODEL_REGISTRY.keys())}"

ValueError: Model architectures ['GPTBigCodeForCausalLM'] are not supported for now. Supported architectures: ['GPT2LMHeadModel', 'GPTNeoXForCausalLM', 'LlamaForCausalLM', 'OPTForCausalLM']
```
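For anyone hitting the same error: one way to check which architectures your installed vLLM build supports is to inspect the registry the traceback points at. Note that `_MODEL_REGISTRY` is a private attribute, and the import path below is taken from the traceback above, so it may change between versions:

```python
# Inspect vLLM's internal model registry to list supported architectures.
# _MODEL_REGISTRY is private API; the module path comes from the traceback
# above and may differ in other vLLM versions.
from vllm.model_executor.model_loader import _MODEL_REGISTRY

print(sorted(_MODEL_REGISTRY.keys()))
# e.g. ['GPT2LMHeadModel', 'GPTNeoXForCausalLM', 'LlamaForCausalLM', 'OPTForCausalLM']
```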
Can you try building from source with the latest master commit?
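In case it helps, a minimal sketch of building from source, assuming a standard editable pip install (check the vLLM installation docs for the exact steps for your setup):

```bash
# Build and install vLLM from the latest master commit.
git clone https://github.com/vllm-project/vllm.git
cd vllm
pip install -e .  # compiles the CUDA kernels, so this can take a while
```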
Closing this issue due to inactivity. Feel free to re-open if there are any updates.
Yes, building from source works. Thanks!
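For completeness, a sketch of the snippet from the traceback once a source build with GPTBigCodeForCausalLM support is installed (the model name and prompt are taken from the original report):

```python
from vllm import LLM

prompts = ["Helloworld in python"]  # you can put several prompts in this list
llm = LLM(model="Safurai/Safurai-001")  # fine-tuned StarCoder (GPTBigCodeForCausalLM) checkpoint
outputs = llm.generate(prompts)  # trigger inference

for output in outputs:
    print(output.outputs[0].text)  # generated completion for each prompt
```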