
adding support for anthropic, azure, cohere, llama2

Open krrishdholakia opened this issue 2 years ago • 7 comments

Hi @polyrabbit ,

Noticed you're only calling OpenAI. I'm working on litellm (a simple library to standardize LLM API calls - https://github.com/BerriAI/litellm) and was wondering if we could be helpful.

Added support for Claude, Cohere, Azure and Llama2 (via Replicate) by replacing the ChatOpenAI completion call with a litellm completion call. The code is pretty similar to the OpenAI class - as litellm follows the same pattern as the openai-python sdk.
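
For illustration, here's a rough sketch of the swap (the model names and prompt below are placeholders, not the exact ones used in this repo):

import litellm

messages = [{"role": "user", "content": "Summarize this Hacker News story ..."}]

# before: openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
# after: same call shape, the provider is picked from the model name
response = litellm.completion(model="gpt-3.5-turbo", messages=messages)      # OpenAI
# response = litellm.completion(model="claude-2", messages=messages)         # Anthropic
# response = litellm.completion(model="command-nightly", messages=messages)  # Cohere (similarly for Llama2 via Replicate)

print(response["choices"][0]["message"]["content"])  # OpenAI-style response shape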

Would love to know if this helps.

Happy to add additional tests / update documentation if the initial PR looks good to you.

krrishdholakia avatar Aug 08 '23 18:08 krrishdholakia

Hi, thanks for this wonderful library.

Just one quick question - does it support function calling for other models, or only for OpenAI models? This app relies on JSON responses.

polyrabbit avatar Aug 09 '23 03:08 polyrabbit

Yes, it supports function calling - exactly the way OpenAI does it: https://litellm.readthedocs.io/en/latest/input/
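
For example, assuming the functions / function_call parameters are passed straight through (the schema below is hypothetical, just to show the shape):

import json
import litellm

functions = [{
    "name": "summarize_story",  # hypothetical function, for illustration only
    "description": "Return a structured summary of a story",
    "parameters": {
        "type": "object",
        "properties": {"summary": {"type": "string"}},
        "required": ["summary"],
    },
}]

response = litellm.completion(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "Summarize: ..."}],
    functions=functions,
    function_call={"name": "summarize_story"},
)

# the arguments come back as a JSON string, same as with openai-python
args = json.loads(response["choices"][0]["message"]["function_call"]["arguments"])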

krrishdholakia avatar Aug 09 '23 03:08 krrishdholakia

Nice! I'll try it later, thanks

polyrabbit avatar Aug 09 '23 03:08 polyrabbit

One difference I found is in how the timeout is set - OpenAI uses a timeout parameter whereas litellm uses force_timeout. Is that intended?
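
To illustrate (values are made up):

import openai
import litellm

messages = [{"role": "user", "content": "hello"}]

# openai-python accepts a timeout kwarg on ChatCompletion.create
openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages, timeout=30)

# litellm appears to expect force_timeout instead
litellm.completion(model="gpt-3.5-turbo", messages=messages, force_timeout=30)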

Could you please also add litellm as a dependency to the requirements.txt file?

polyrabbit avatar Aug 09 '23 16:08 polyrabbit

Hey @polyrabbit, I updated the requirements.txt.

Re: timeout - I thought that was for the completions endpoint; I don't recall seeing a timeout parameter for ChatCompletions. If you could share any relevant documentation, I'm happy to check it out.

Let me know if there are any remaining blockers for this PR.

krrishdholakia avatar Aug 11 '23 21:08 krrishdholakia

I see it here: https://github.com/openai/openai-python/blob/b82a3f7e4c462a8a10fa445193301a3cefef9a4a/openai/api_resources/chat_completion.py#L21-L28

@classmethod
def create(cls, *args, **kwargs):
    """
    Creates a new chat completion for the provided messages and parameters.

    See https://platform.openai.com/docs/api-reference/chat-completions/create
    for a list of valid parameters.
    """
    start = time.time()
    timeout = kwargs.pop("timeout", None)  # timeout is accepted here and removed from kwargs before the request

So timeout is used in my code; after switching to litellm, the code throws an exception: unexpected keyword argument 'timeout'.
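
Until that's fixed, a rough shim on my side could look like this (sketch only, assuming force_timeout is the litellm equivalent):

import litellm

kwargs = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "hello"}],
    "timeout": 20,
}

timeout = kwargs.pop("timeout", None)   # strip the openai-style kwarg
if timeout is not None:
    kwargs["force_timeout"] = timeout   # map it to litellm's parameter name

response = litellm.completion(**kwargs)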

polyrabbit avatar Aug 12 '23 15:08 polyrabbit

Got it - I'll make a fix for it and update the PR.

krrishdholakia avatar Aug 12 '23 17:08 krrishdholakia