aixplora
Add support for Palm, Claude-2, Llama2, CodeLlama (100+ LLMs)
This PR adds support for the above-mentioned LLMs using LiteLLM: https://github.com/BerriAI/litellm/
Example

```python
import os
from litellm import completion

## set ENV variables
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

# anthropic call
response = completion(model="claude-instant-1", messages=messages)
```
The latest updates on your projects.
| Name | Status | Preview | Comments | Updated (UTC) |
|---|---|---|---|---|
| aixplora | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Sep 9, 2023 4:09pm |
addressing: https://github.com/grumpyp/aixplora/issues/128
@grumpyp can you take a look at this PR when possible? Thanks!
Thanks for the contribution. Would you also implement it on the frontend side? Does this implementation download the LLMs to your machine?
If yes, did you have a look at how the other LLMs are currently stored? It would make sense to do it the same way :)
Thx!
- Can we address the frontend in a separate PR?
- This does not download any LLMs to your machine.
Why should we separate it into another PR?
Ok, as far as I understand it just uses the LLMs provided by third-party APIs?
That's fine, as long as it is compatible with our current implementation. Is there a list of all available LLMs which can be used with litellm? Then we could think about how to implement it nicely on the frontend side.
Or at least add some tests to this PR, please.
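One way to add tests for this PR without needing real API keys would be to mock out the LiteLLM call. A minimal sketch, assuming a hypothetical wrapper `ask_llm` (the name and shape are illustrative, not aixplora's actual code):

```python
from unittest.mock import MagicMock

def ask_llm(model, prompt, completion_fn):
    """Hypothetical wrapper around litellm.completion; the completion
    function is injected so tests can substitute a mock."""
    messages = [{"role": "user", "content": prompt}]
    response = completion_fn(model=model, messages=messages)
    return response["choices"][0]["message"]["content"]

def test_ask_llm_passes_selected_model():
    # Fake completion returning a LiteLLM-style response dict.
    fake = MagicMock(return_value={"choices": [{"message": {"content": "hi"}}]})
    assert ask_llm("claude-instant-1", "Hello", completion_fn=fake) == "hi"
    # Verify the chosen model was forwarded to the completion call.
    assert fake.call_args.kwargs["model"] == "claude-instant-1"

test_ask_llm_passes_selected_model()
```

The same pattern would cover the other providers, since LiteLLM exposes them all through the one `completion` interface.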
@grumpyp
re: provider list,
yes here are the docs - https://docs.litellm.ai/docs/providers
via code:

```python
import litellm
print(litellm.provider_list)
```
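For the frontend side discussed above, one option would be to build the model/provider dropdown from `litellm.provider_list` instead of hardcoding providers. An illustrative sketch (the function name and output shape are assumptions, not part of this PR):

```python
def providers_for_dropdown(provider_list):
    """Turn a list of provider ids into sorted (value, label) pairs
    suitable for a frontend select element."""
    return sorted((p, p.replace("_", " ").title()) for p in provider_list)

# In real code the argument would be litellm.provider_list;
# a hardcoded sample is used here for illustration.
options = providers_for_dropdown(["openai", "cohere", "anthropic"])
```

This keeps the frontend in sync with whatever providers LiteLLM supports, which seems closer to the current implementation than listing models manually.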
Feel free to test everything.
If it doesn't break anything, I'll be happy to merge it.