MetaGPT
Add support for Llama 2, PaLM, Anthropic, Cohere models - using litellm
Addressing https://github.com/geekan/MetaGPT/issues/97
I'm the maintainer of litellm (https://github.com/BerriAI/litellm), a simple, lightweight package for calling OpenAI, Azure, Cohere, Anthropic, and Replicate API endpoints.
This PR adds support for models from all of the providers mentioned above (by creating a liteLLM class).
Here's a sample of how it's used:
import os
from litellm import completion
## set ENV variables
# ENV variables can be set in .env file, too. Example in .env.example
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"
messages = [{ "content": "Hello, how are you?","role": "user"}]
# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)
# cohere call
response = completion("command-nightly", messages)
# anthropic call
response = completion(model="claude-instant-1", messages=messages)
cc @geekan, can I get a review on this?
Thanks! Can we run Llama 2 from a local file and call it, so everything stays local without the need to call external APIs?
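For reference, litellm can also route to a locally served model by pointing it at a local endpoint. A minimal sketch, assuming Llama 2 is served locally through Ollama on its default port (the model name and api_base below are illustrative, not part of this PR):
import os
from litellm import completion

messages = [{"content": "Hello, how are you?", "role": "user"}]

# assumed local setup: Llama 2 served by Ollama on the default port,
# so no external API is called
response = completion(
    model="ollama/llama2",
    messages=messages,
    api_base="http://localhost:11434",
)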
Any update here? @ishaan-jaff
Thanks for the bump, @stellaHSR; taking a look.
Hi @ishaan-jaff, I suggest implementing acompletion_text() to make it easier to use liteLLM in MetaGPT.
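For example, a thin async wrapper around litellm could look roughly like this; a sketch only, with the function name and signature assumed from the suggestion above rather than taken from MetaGPT's actual provider interface:
import litellm

# sketch: an async helper that returns only the reply text,
# so MetaGPT callers don't have to unpack the raw response
async def acompletion_text(messages: list[dict], model: str = "gpt-3.5-turbo") -> str:
    response = await litellm.acompletion(model=model, messages=messages)
    return response["choices"][0]["message"]["content"]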
Would love to be able to use local models, especially Llama 2 70B.
Any updates on this? @ishaan-jaff