
Feature Request - Multi-Model Service Provider Support

Open · arunkumarakvr opened this issue on Apr 28, 2024

GitHub Issue: Feature Request - Multi-Model Service Provider Support

Is your feature request related to a problem? Please describe.
Perplexica currently supports only a single model service provider, which restricts users' options and may not fully meet their diverse needs for AI conversation services.

Describe the solution you'd like
Implement support for multiple model service providers in Perplexica, giving users a wider range of choices and making the platform adaptable to different preferences and requirements.

Describe alternatives you've considered
One alternative is to stick with a single model service provider, but this may restrict users' options and limit the platform's flexibility. Another alternative is to develop custom integrations for specific model service providers, but this could be time-consuming and resource-intensive.

Additional context
Expanding support to additional model service providers, such as Azure OpenAI, AWS Bedrock, Anthropic (Claude), Google Cloud Vertex AI, Google AI Studio, Groq, OpenRouter, and Together.ai, would further enrich Perplexica's service provider library. Support for further providers such as Replicate and Perplexity is also requested. Community feedback and discussion are encouraged to help prioritize which providers to support, based on user preferences and requirements.

arunkumarakvr · Apr 28 '24
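
For illustration, here is a minimal sketch of what the requested multi-provider abstraction could look like, assuming LangChain-style chat models; the registry, function name, and model ids below are hypothetical and not taken from the Perplexica codebase:

```ts
import { ChatOpenAI } from "@langchain/openai";
import { ChatOllama } from "@langchain/community/chat_models/ollama";
import type { BaseChatModel } from "@langchain/core/language_models/chat_models";

// Hypothetical provider registry: each entry knows how to construct a chat
// model from environment/config values. Further providers (Groq, OpenRouter,
// Azure, Bedrock, ...) would register additional factories here.
type ProviderFactory = () => BaseChatModel;

const providers: Record<string, ProviderFactory> = {
  openai: () =>
    new ChatOpenAI({ model: "gpt-4o-mini", apiKey: process.env.OPENAI_API_KEY }),
  ollama: () =>
    new ChatOllama({ baseUrl: "http://localhost:11434", model: "llama3" }),
};

export function loadChatModel(name: string): BaseChatModel {
  const factory = providers[name];
  if (!factory) throw new Error(`Unknown model provider: ${name}`);
  return factory();
}
```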

Thanks for the feature request. We're expanding model support, having recently added Ollama, and we're exploring more providers such as Azure and AWS to better meet user needs. Most of these providers will be added gradually.

ItzCrazyKns · Apr 28 '24

Great software, thank you for making it! Leveraging OpenRouter.ai over other services, for instance, would be a huge win, as it provides API access to almost all available models.

denispol · May 01 '24
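
As a rough sketch of why OpenRouter is attractive here: it speaks the OpenAI wire format, so a single OpenAI-compatible client can reach the many models it aggregates. The model id below is only an example, and none of this is Perplexica code:

```ts
import OpenAI from "openai";

// OpenRouter exposes an OpenAI-compatible API, so one client covers most of
// the models it aggregates. Any model id from OpenRouter's catalogue works
// the same way.
const openrouter = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
});

async function main() {
  const completion = await openrouter.chat.completions.create({
    model: "anthropic/claude-3.5-sonnet",
    messages: [{ role: "user", content: "Summarise what Perplexica does." }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```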

Support for Groq and Ollama has been added, along with support for custom OpenAI-compatible endpoints, which should cover most LLM providers.

ItzCrazyKns · May 11 '24
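
A minimal sketch of how a custom OpenAI-compatible endpoint can stand in for most providers, assuming a LangChain ChatOpenAI client; the base URL, model name, and environment variable are placeholders rather than Perplexica's actual configuration keys:

```ts
import { ChatOpenAI } from "@langchain/openai";

// Any OpenAI-compatible server (Groq, OpenRouter, a local gateway, ...) can be
// reached by overriding the base URL. The URL and model name below are
// placeholders, not values taken from Perplexica's configuration.
const model = new ChatOpenAI({
  model: "my-hosted-model",
  apiKey: process.env.CUSTOM_OPENAI_API_KEY,
  configuration: {
    baseURL: "https://llm.example.com/v1",
  },
});

async function main() {
  const response = await model.invoke("Which providers can Perplexica use?");
  console.log(response.content);
}

main();
```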

@ItzCrazyKns None of them are working correctly or properly. I still don't know what you have done to add all these different providers, or how you marked this as completed.

arunkumarakvr · May 11 '24

@ItzCrazyKns I was wondering whether there is a plan to add support for Anthropic's models, considering they are SOTA? Thanks!

RazeBerry · Jul 10 '24