feat: add aphrodite support
@AlpinDale Isn't it already OpenAI-compatible? Meaning you could just call it like this: https://docs.litellm.ai/docs/providers/openai_compatible
@AlpinDale bump on this?
@krrishdholakia Hi, sorry for the late reply.
I'd assume the LiteLLM OpenAI endpoint doesn't support any samplers beyond what OpenAI itself provides. Is that true? If not, I suppose we can use it as-is.
Yea - we send any unmapped kwargs straight through to the provider: https://docs.litellm.ai/docs/completion/input#provider-specific-params.
You can test this out by doing:

```python
import os

from litellm import completion

os.environ["OPENAI_API_KEY"] = "your-api-key"

# call any OpenAI-compatible endpoint
response = completion(
    model="openai/<your-model-name>",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    api_base="your-api-base",
)
```
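To check the sampler pass-through specifically, here's a minimal sketch. It assumes your Aphrodite server exposes a `min_p` sampling parameter (that name is my assumption, not something from this thread); since `min_p` isn't an OpenAI param, litellm shouldn't map it and should forward it to the backend as an extra kwarg:

```python
import os

from litellm import completion

os.environ["OPENAI_API_KEY"] = "your-api-key"

response = completion(
    model="openai/<your-model-name>",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    api_base="your-api-base",
    min_p=0.1,  # assumed Aphrodite-side sampler; unmapped, so passed through as-is
)
print(response.choices[0].message.content)
```

If the request succeeds and the server's logs show `min_p` in the request body, the pass-through covers Aphrodite's extra samplers without any litellm changes.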
Let me know if this solves your problem!
I don't mind adding something more specific for Aphrodite, but if you have no hosted endpoint, I won't be able to add this to our CI/CD pipeline for testing.