
Error Fetching Models

Webslug opened this issue Aug 24 '23 · 3 comments

Hi, I managed to build BotSharp and got the UI running. Can it load Hugging Face models? I was hoping to run local models on my own system. I keep getting the error below and couldn't figure out how to change the path to the models. I don't want to use OpenAI.

[Screenshot: "Error Fetching Models" message in the UI]

Webslug · Aug 24 '23 08:08

Right now, the most complete LLM backend is Azure OpenAI.

hchen2020 · Aug 24 '23 16:08

So it's a complete waste of time to use this, unless I have Azure OpenAI?

Webslug · Aug 31 '23 14:08

Hi @Webslug @hchen2020, I'm the maintainer of LiteLLM (an abstraction layer for calling 100+ LLMs). We let you create a proxy server that can call 100+ LLMs, and I think it can solve your problem (I'd love your feedback if it doesn't).

Try it here: https://docs.litellm.ai/docs/proxy_server https://github.com/BerriAI/litellm

Using LiteLLM Proxy Server

import openai

# Point the (pre-1.0) OpenAI SDK at the local LiteLLM proxy instead of api.openai.com
openai.api_base = "http://0.0.0.0:8000"  # proxy url
print(openai.ChatCompletion.create(model="test", messages=[{"role": "user", "content": "Hey!"}]))
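
Because the proxy exposes an OpenAI-compatible endpoint, clients in other languages (BotSharp itself is C#) should also be able to talk to it just by swapping the OpenAI base URL. A minimal sketch of the equivalent raw request, assuming the proxy's default port 8000 and its /chat/completions route:

import requests

# Raw HTTP equivalent: the proxy speaks the OpenAI chat-completions wire format,
# so any language's HTTP client can call it once the base URL points at the proxy.
resp = requests.post(
    "http://0.0.0.0:8000/chat/completions",  # assumed default port and route
    json={"model": "test", "messages": [{"role": "user", "content": "Hey!"}]},
    timeout=30,
)
print(resp.json())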

Creating a proxy server

Ollama models

$ litellm --model ollama/llama2 --api_base http://localhost:11434
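
Before starting the proxy against Ollama, it helps to confirm the Ollama server is actually running and has the model pulled. A quick check, assuming Ollama's default port 11434 and its /api/tags endpoint, which lists locally pulled models:

import requests

# List the models Ollama currently has pulled locally.
# If "llama2" is missing, pull it first with: ollama pull llama2
tags = requests.get("http://localhost:11434/api/tags", timeout=5).json()
print([m["name"] for m in tags.get("models", [])])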

Hugging Face Models

$ export HUGGINGFACE_API_KEY=my-api-key #[OPTIONAL]
$ litellm --model huggingface/bigcode/starcoder   # any Hugging Face Hub model id, prefixed with huggingface/

Anthropic

$ export ANTHROPIC_API_KEY=my-api-key
$ litellm --model claude-instant-1

PaLM

$ export PALM_API_KEY=my-palm-key
$ litellm --model palm/chat-bison
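
Whichever backend you launch the proxy with, the client code above stays the same; as far as I know the proxy routes requests to the model given in its --model flag. If your app already uses the pre-1.0 OpenAI Python SDK, you may not need code changes at all, since (if I remember right) that SDK also reads the base URL from the OPENAI_API_BASE environment variable, a sketch:

import os

# Assumption: exporting OPENAI_API_BASE before the app starts redirects any
# pre-1.0 OpenAI-SDK-based app to the local proxy without touching its code.
os.environ["OPENAI_API_BASE"] = "http://0.0.0.0:8000"

import openai  # the old SDK reads OPENAI_API_BASE at import time
print(openai.ChatCompletion.create(model="test", messages=[{"role": "user", "content": "Hey!"}]))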

ishaan-jaff · Sep 29 '23 04:09

Sorry, I gave up on this as it was getting too confusing, and in the end I just built my own console Discord AI bot.

Thanks for the help and for the project.

Webslug · May 11 '24 07:05