chatgpt-google-summary-extension
When will the Glarity plugin support access to claude's api?
+1
@Felo-Sparticle Hi, is Claude support on roadmap?
+1
Hi @haoqwenie @saccohuo @Oshibuki @givebest, I believe I can help with this issue. I'm the maintainer of LiteLLM (https://github.com/BerriAI/litellm), which lets you use any LLM as a drop-in replacement for gpt-3.5-turbo.
TLDR with LiteLLM:
- Use any LLM as a drop-in replacement for gpt-3.5-turbo
- If you don't have direct access to an LLM, use the LiteLLM proxy to reach it
You can use LiteLLM in the following ways:
With your own API key:
This calls the provider's API directly.
```python
from litellm import completion
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-openai-key"
os.environ["COHERE_API_KEY"] = "your-cohere-key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)
```
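Since this thread is specifically about Claude, here is a minimal sketch of the same pattern pointed at Anthropic. The `ANTHROPIC_API_KEY` variable name and the `claude-2` model id are assumptions based on LiteLLM's provider conventions; check the current LiteLLM docs for the exact model list before relying on them.

```python
# Hypothetical sketch: calling Claude directly through LiteLLM with your own
# Anthropic key. Env var name and model id are assumptions -- verify them
# against LiteLLM's provider docs.
import os

os.environ["ANTHROPIC_API_KEY"] = "your-anthropic-key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# Guard the network call so the sketch runs even without a real key configured.
if os.environ["ANTHROPIC_API_KEY"] != "your-anthropic-key":
    from litellm import completion
    # anthropic call -- same interface as the openai/cohere calls above
    response = completion(model="claude-2", messages=messages)
    print(response["choices"][0]["message"]["content"])
```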
Using the LiteLLM Proxy with a LiteLLM Key
This is great if you don't have access to Claude but want to use the open-source LiteLLM proxy to reach it.
```python
from litellm import completion
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "sk-litellm-5b46387675a944d2"  # [OPTIONAL] replace with your openai key
os.environ["COHERE_API_KEY"] = "sk-litellm-5b46387675a944d2"  # [OPTIONAL] replace with your cohere key

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)
```
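Following the same pattern, reaching Claude through the LiteLLM proxy would presumably mean setting the Anthropic key to the LiteLLM key. Both the `ANTHROPIC_API_KEY` variable name and the `claude-2` model id are assumptions here, not confirmed by the proxy docs quoted above.

```python
# Hypothetical sketch: Claude via the LiteLLM proxy key. Env var name and
# model id are assumptions -- confirm both against LiteLLM's proxy docs.
import os

os.environ["ANTHROPIC_API_KEY"] = "sk-litellm-5b46387675a944d2"  # [OPTIONAL] replace with your anthropic key

messages = [{"content": "Hello, how are you?", "role": "user"}]

# Guard the network call so the sketch runs without proxy access configured.
if os.environ.get("LITELLM_PROXY_READY") == "1":
    from litellm import completion
    # anthropic call routed through the LiteLLM proxy
    response = completion(model="claude-2", messages=messages)
```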