
How to load a model offline

fawpcmhgung162 opened this issue 1 year ago · 1 comment

When I execute openllm start for the first time, the model is downloaded locally and then the server starts. On subsequent starts, however, it still sends a network request to huggingface (even though nothing is downloaded again) before it can start.

I tried setting HF_DATASETS_OFFLINE=1 and TRANSFORMERS_OFFLINE=1, but it didn't work.

fawpcmhgung162 avatar Jan 25 '24 05:01 fawpcmhgung162

same question here, need a fully local mode

zhangxinyang97 avatar Mar 04 '24 03:03 zhangxinyang97

In 0.5, you can save the model to the Bento model store first with

with bentoml.models.create("your-model") as model:
  ...

then you can start with openllm start your-model.
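A fuller sketch of that workflow, assuming BentoML's `bentoml.models.create` context manager and `huggingface_hub.snapshot_download` (the function name `save_hf_model_to_bento_store` and its parameters are illustrative, not part of openllm), might look like:

```python
def save_hf_model_to_bento_store(repo_id: str, model_name: str) -> None:
    """Download a Hugging Face repo once and keep it in the local BentoML
    model store, so later starts don't need to reach huggingface.

    Sketch only: assumes `bentoml` and `huggingface_hub` are installed;
    `repo_id` and `model_name` are hypothetical example parameters.
    """
    import bentoml
    from huggingface_hub import snapshot_download

    with bentoml.models.create(model_name) as model:
        # Download the model files directly into the bento model's
        # directory (model.path); the store entry is finalized on exit.
        snapshot_download(repo_id=repo_id, local_dir=model.path)


# Hypothetical usage, matching the comment above:
#   save_hf_model_to_bento_store("facebook/opt-125m", "your-model")
# then from a shell:
#   openllm start your-model
```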

aarnphm avatar Jun 03 '24 21:06 aarnphm