
Add support for Palm, Claude-2, Llama2, CodeLlama (100+LLMs)

Open ishaan-jaff opened this issue 11 months ago • 8 comments

This PR adds support for the above mentioned LLMs using LiteLLM https://github.com/BerriAI/litellm/

Example

import os
from litellm import completion

## set ENV variables
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"

messages = [{ "content": "Hello, how are you?","role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

# anthropic call
response = completion(model="claude-instant-1", messages=messages)
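One convenience worth noting (an editor's aside, not text from the PR): LiteLLM returns every provider's response in the OpenAI chat-completion shape, so downstream code can read the reply the same way for every model. Sketched here with a hand-built dict standing in for a real response object:

```python
# Stand-in for what completion() returns: LiteLLM mirrors the
# OpenAI chat-completion schema for every provider.
fake_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "I'm doing well, thanks!"}}
    ]
}

def reply_text(response: dict) -> str:
    """Extract the assistant's reply, provider-agnostic."""
    return response["choices"][0]["message"]["content"]

print(reply_text(fake_response))
```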

ishaan-jaff avatar Sep 09 '23 16:09 ishaan-jaff

The latest updates on your projects. Learn more about Vercel for Git ↗︎

Name | Status | Preview | Comments | Updated (UTC)
aixplora | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Sep 9, 2023 4:09pm

vercel[bot] avatar Sep 09 '23 16:09 vercel[bot]

addressing: https://github.com/grumpyp/aixplora/issues/128

ishaan-jaff avatar Sep 09 '23 16:09 ishaan-jaff

@grumpyp can you take a look at this PR when possible? Thanks!

ishaan-jaff avatar Sep 09 '23 16:09 ishaan-jaff

Thanks for the contribution. Would you also implement it on the frontend side? Does this implementation download the LLMs to your machine?

If so, have you looked at how the other LLMs are currently stored? It would make sense to do it the same way :)

Thx!

grumpyp avatar Sep 09 '23 19:09 grumpyp

  • Can we address the frontend in a separate PR?
  • This does not download any LLMs to your machine.
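To make the second point concrete (an illustrative sketch by the editor, not code from the PR): LiteLLM dispatches each call to the matching hosted API based on the model name, so no weights are ever stored locally. A toy dispatcher showing the idea, with the prefix-to-provider mapping chosen purely for illustration:

```python
# Toy illustration of name-based routing: each model name maps to a
# hosted provider's API; nothing is downloaded to the local machine.
PROVIDER_PREFIXES = {
    "gpt-": "openai",
    "claude-": "anthropic",
    "command": "cohere",
    "palm": "google",
}

def resolve_provider(model: str) -> str:
    """Pick the hosted provider whose API will serve this model."""
    for prefix, provider in PROVIDER_PREFIXES.items():
        if model.startswith(prefix):
            return provider
    raise ValueError(f"unknown model: {model}")

print(resolve_provider("gpt-3.5-turbo"))    # openai
print(resolve_provider("claude-instant-1")) # anthropic
```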

ishaan-jaff avatar Sep 11 '23 14:09 ishaan-jaff

Why should we separate it into another PR?

OK, so as far as I understand it, it just uses the LLMs provided by third-party APIs?

That's fine, as long as it is compatible with our current implementation. Is there a list of all the LLMs that can be used with LiteLLM? Then we could think about how to implement it nicely on the frontend side.

Or at least add some tests to this PR please.
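On the testing request (an editor's sketch, not part of the PR): since the backends are all remote APIs, the natural unit test mocks out the completion call so no keys or network access are needed. Assuming a hypothetical wrapper that takes the completion callable as a dependency:

```python
from unittest import mock

# Hypothetical wrapper around litellm.completion, written so the
# completion callable can be swapped out in tests.
def ask_llm(prompt: str, completion_fn) -> str:
    messages = [{"role": "user", "content": prompt}]
    response = completion_fn(model="gpt-3.5-turbo", messages=messages)
    return response["choices"][0]["message"]["content"]

# Test double: returns a canned OpenAI-shaped response without any network call.
fake_completion = mock.Mock(
    return_value={"choices": [{"message": {"content": "pong"}}]}
)

assert ask_llm("ping", fake_completion) == "pong"
fake_completion.assert_called_once()
print("mocked completion test passed")
```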

grumpyp avatar Sep 11 '23 21:09 grumpyp

@grumpyp

re: provider list,

yes here are the docs - https://docs.litellm.ai/docs/providers

via code:

import litellm

print(litellm.provider_list)

krrishdholakia avatar Oct 04 '23 20:10 krrishdholakia


Feel free to test everything.

If it doesn't break, I'll be happy to merge it

grumpyp avatar Oct 05 '23 03:10 grumpyp