Enhancement: Custom API endpoints
Suggestion: Enhancement
An example of a custom endpoint would be LLaMA 13B at https://api.runpod.ai/v2/yourServer/runsync (a rough request sketch follows the list below).
It would:
- allow Intellibar to be used in places where the OpenAI API is not available.
- be cheaper than OpenAI, possibly by a lot.
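For concreteness, here is a minimal sketch of what a request to such a RunPod serverless endpoint might look like. This is not Intellibar code: the endpoint URL reuses the placeholder above, `RUNPOD_API_KEY` is assumed to be set in the environment, and the exact input/output schema depends on the worker you deploy, so the payload is illustrative only.

```python
# Sketch: calling a RunPod serverless endpoint that wraps LLaMA 13B.
# The "input" schema is defined by your worker; adjust as needed.
import os
import requests

RUNPOD_API_KEY = os.environ["RUNPOD_API_KEY"]            # your RunPod key (assumed)
ENDPOINT_URL = "https://api.runpod.ai/v2/yourServer/runsync"  # placeholder from above

response = requests.post(
    ENDPOINT_URL,
    headers={
        "Authorization": f"Bearer {RUNPOD_API_KEY}",
        "Content-Type": "application/json",
    },
    json={"input": {"prompt": "Summarize the selected text."}},
    timeout=120,
)
response.raise_for_status()
print(response.json().get("output"))
```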
We've been thinking about this idea ourselves.
I'll be glad if more people join the discussion and share their use cases. That will help us work out the details, direction, and priority.
Sure. I'd like to:
- point the endpoint at my serverless RunPod LLaMA, because it's vastly cheaper than OpenAI.
- point the endpoint at my server running MemGPT, loaded with my own reference material, to give me an effectively unlimited context window.
Does that sound useful?
Yes, thanks! I hope others join the discussion as well.
+1 for Ollama and other external endpoints.
I need an OpenAI-compatible endpoint to connect my self-hosted models.
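In practice, "OpenAI-compatible" usually means the standard OpenAI client can simply be pointed at a different base URL. A minimal sketch, assuming a local Ollama instance (which exposes such an API at http://localhost:11434/v1) and a model name like "llama3" that you have already pulled:

```python
# Sketch: using the openai client against a self-hosted, OpenAI-compatible server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # self-hosted endpoint (assumed: local Ollama)
    api_key="ollama",                      # Ollama ignores the key, but the client requires one
)

reply = client.chat.completions.create(
    model="llama3",  # assumed: whatever model you have pulled locally
    messages=[{"role": "user", "content": "Hello from Intellibar!"}],
)
print(reply.choices[0].message.content)
```

If Intellibar let users configure the base URL (and optionally the key and model name), the same mechanism would cover Ollama, RunPod workers that speak the OpenAI protocol, and other self-hosted servers.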