LLocalSearch
vLLM support (OpenAI-compatible API)
Is your feature request related to a problem? Please describe. No.
Describe the solution you'd like Ability to use a self-hosted, OpenAI-compatible API endpoint to speed up results via vLLM.
Describe alternatives you've considered N/A
Additional context While Ollama is a nice way to get started with LLMs, I think more experienced users could benefit from support for self-hosted, OpenAI-compatible API endpoints like the one vLLM exposes.
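
For illustration, here is a minimal client-side sketch of what this could look like, assuming a vLLM server is already running with its OpenAI-compatible API (e.g. started with `vllm serve <model>`); the base URL, API key, and model name below are placeholder assumptions, not anything LLocalSearch currently exposes:

```python
# Minimal sketch: querying a self-hosted vLLM server through its
# OpenAI-compatible API using the official openai Python client.
# base_url, api_key, and model are placeholder assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # default vLLM OpenAI-compatible endpoint
    api_key="EMPTY",                      # vLLM accepts any key unless one is configured
)

response = client.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.2",  # whichever model the vLLM server is serving
    messages=[{"role": "user", "content": "Summarize the top search result."}],
)
print(response.choices[0].message.content)
```

Because the endpoint speaks the same API shape as OpenAI, supporting it would mostly mean letting users point the existing LLM client at a configurable base URL instead of the Ollama backend.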