
Add support for Azure, llama2, palm, claude2, cohere command-nightly, HF, Replicate (100+ LLMs)

Open ishaan-jaff opened this issue 2 years ago • 4 comments

This PR adds support for models from all the above mentioned providers using https://github.com/BerriAI/litellm/

Here's a sample of how it's used:

import os

from litellm import completion, acompletion

## set ENV variables
# ENV variables can also be set in a .env file; see .env.example
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"

messages = [{ "content": "Hello, how are you?","role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# llama2 call
model_name = "replicate/llama-2-70b-chat:2c1608e18606fad2812020dc541930f2d0495ce32eee50074220b87300bc16e1"
response = completion(model=model_name, messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

# anthropic call
response = completion(model="claude-instant-1", messages=messages)
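
Worth noting: litellm normalizes every provider's response to the OpenAI format, so downstream code can read all of them uniformly. A minimal sketch (the response dict below is a hand-written illustration of that shape, not a live API call):

```python
# Hand-written response in the OpenAI-compatible shape that litellm
# returns for every provider above (illustrative, not from a real call).
sample_response = {
    "model": "gpt-3.5-turbo",
    "choices": [
        {"message": {"role": "assistant", "content": "I'm doing well, thanks!"}}
    ],
}

def reply_text(response) -> str:
    # The same accessor works whether the call went to OpenAI, Cohere,
    # Replicate, or Anthropic, since litellm normalizes the schema.
    return response["choices"][0]["message"]["content"]

print(reply_text(sample_response))
```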

ishaan-jaff avatar Sep 05 '23 00:09 ishaan-jaff

@zvone187 @LeonOstrez can you please take a look at this PR when possible?

Happy to add docs/tests too if this initial commit looks good😊

ishaan-jaff avatar Sep 05 '23 01:09 ishaan-jaff

I wonder if a misconfiguration could cause it to use more than one API token at a time. Have you tested your token usage across all models with this snippet yourself? Did you notice an improvement when using more models at once?

paradiselabs-ai avatar Sep 19 '23 22:09 paradiselabs-ai

Yes, we have a live proxy API that can handle all LLMs: https://docs.litellm.ai/docs/proxy_api

  • we use LiteLLM in production ourselves

ishaan-jaff avatar Sep 19 '23 22:09 ishaan-jaff

Any update on this, @nalbion? How can we be helpful to gpt-pilot? We noticed this issue mentioning a move to litellm: https://github.com/Pythagora-io/gpt-pilot/issues/132

Happy to hop on a call and discuss the integration if you'd like too

ishaan-jaff avatar Oct 16 '23 23:10 ishaan-jaff

GPT Pilot can now use LiteLLM through its proxy; see https://github.com/Pythagora-io/gpt-pilot/wiki/Using-GPT%E2%80%90Pilot-with-Local-LLMs#litellm
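
For reference, a rough sketch of that setup (the model name, port, and environment variable here are assumptions; the wiki page above has the exact settings):

```shell
# Start a local LiteLLM proxy for a chosen model (model name is illustrative)
litellm --model ollama/codellama

# Point GPT Pilot at the proxy's OpenAI-compatible endpoint
# (variable name and port are assumptions; see the wiki for exact values)
export OPENAI_ENDPOINT=http://localhost:8000/v1/chat/completions
```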

senko avatar Jan 21 '24 00:01 senko