OpenLLM
bug: Unclear setup
Describe the bug
Running the project by following the README fails because the necessary model cannot be found. The missing step appears to be running something like `openllm download dolly-v2` before `openllm start dolly-v2` can be run.
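For reference, the sequence I would expect to work (assuming the `download` subcommand mentioned above is the right one for this release) is roughly:

```bash
# Fetch the model weights into the local BentoML store first
openllm download dolly-v2

# Then start the server, which should now find the cached model
openllm start dolly-v2
```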
To reproduce
- Install per instructions
- Run per instructions
Logs
`NotFound(bentoml.exceptions.NotFound: Model 'pt-databricks-dollyv2-3b:877db3ed12a3086500d144b9ef74e469b107a041' is not found in BentoML store`
Environment
Ubuntu 22.04
Hey there, what is the version of openllm you are using?
Can you also show the output of `openllm models --show-available`?
Hello, I am also facing a similar issue; I am using version v0.1.12.
I ran the command
`openllm build flan-t5 --model-id google/flan-t5-large`
and it failed with the error
`bentoml.exceptions.NotFound: Bento 'google-flan-t5-large-service:2d6503cbe79448e511312ba3377a9cde16a2135a' is not found in BentoML store <osfs '/home/.../bentoml/bentos'>`
and here is the output of the `openllm models --show-available` command.
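In case it is useful, I also checked what is registered locally with the standard BentoML CLI (assuming a default installation), roughly like this:

```bash
# List models currently registered in the local BentoML store
bentoml models list

# List any Bentos that have already been built
bentoml list
```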
Please help me resolve this issue.
Oh sorry, this is an oversight on my part. I will release a quick patch to fix this.
Hey all, please try out the latest change; hopefully I have smoothed out all the rough edges.
I have fixed this on main.