
feat: add aphrodite support

Open · AlpinDale opened this issue 2 years ago · 6 comments

This PR adds support for Aphrodite Engine. WIP as this is currently untested.

— AlpinDale, Dec 15 '23 15:12

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| litellm | ❌ Failed (Inspect) | | | Dec 15, 2023 3:42pm |

— vercel[bot], Dec 15 '23 15:12

@AlpinDale isn't it already OpenAI-compatible? Meaning you could just call it like this: https://docs.litellm.ai/docs/providers/openai_compatible [screenshot of the OpenAI-compatible provider docs]

— krrishdholakia, Dec 15 '23 17:12

@AlpinDale bump on this?

— krrishdholakia, Dec 16 '23 20:12

@krrishdholakia hi, sorry for the late reply.

I'd assume the LiteLLM OpenAI endpoint doesn't support any samplers beyond what OpenAI itself provides. Is that true? If it does pass them through, I suppose we can use it as-is.

— AlpinDale, Dec 17 '23 08:12

Yea - we send any unmapped kwargs straight through to the provider: https://docs.litellm.ai/docs/completion/input#provider-specific-params.

You can test this out by doing:

```python
import os
from litellm import completion

os.environ["OPENAI_API_KEY"] = "your-api-key"

# call any OpenAI-compatible endpoint through litellm
response = completion(
    model="openai/<your-model-name>",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    api_base="your-api-base",
)
```
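To make the pass-through behavior concrete, here is a minimal dependency-free sketch of what it means for Aphrodite: extra sampler kwargs simply ride along with the standard fields. The `min_p`/`top_k` names and the localhost endpoint are assumptions about an Aphrodite deployment, not confirmed in this thread; `build_request` is a hypothetical helper standing in for the request litellm would assemble.

```python
# Sketch of litellm's pass-through behavior: kwargs it does not map to the
# OpenAI spec are forwarded to the provider unchanged. No network call here;
# we only assemble the request dict that would be sent.

def build_request(model, messages, api_base, **provider_kwargs):
    """Merge standard fields with provider-specific (unmapped) kwargs."""
    return {
        "model": model,
        "messages": messages,
        "api_base": api_base,
        **provider_kwargs,
    }

# Hypothetical Aphrodite deployment; min_p/top_k are sampler params outside
# the OpenAI spec (names assumed -- check the Aphrodite docs for your version).
request = build_request(
    "openai/mistral-7b",
    [{"role": "user", "content": "Hello, how are you?"}],
    "http://localhost:2242/v1",
    min_p=0.1,
    top_k=40,
)

print(request["min_p"], request["top_k"])
```

With litellm installed, the same extra kwargs would go directly into `completion(...)`, e.g. `completion(model=..., api_base=..., min_p=0.1, top_k=40)`, and be forwarded to the server untouched.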

Let me know if this solves your problem!

— krrishdholakia, Dec 23 '23 06:12

I don't mind adding something more specific for Aphrodite, but if you have no hosted endpoint, I won't be able to add this to our CI/CD pipeline for testing.

— krrishdholakia, Dec 23 '23 06:12