feature: custom vendors for RAG embedding API
Feature request
My company proxies calls to OpenAI but maintains the same API interface. I'd be able to use the RAG feature if I could create a custom adapter that points at my company's endpoint rather than defaulting to OpenAI.
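A minimal sketch of what that could look like, assuming the proxy exposes the same REST interface as OpenAI and the service uses the `openai` Python SDK (>= 1.0); the proxy URL, key, and model name below are placeholders:

```python
# Sketch only, not this project's actual code: an OpenAI-compatible proxy can
# be targeted by overriding base_url on the standard client.
from openai import OpenAI

client = OpenAI(
    base_url="https://llm-proxy.example.com/v1",  # hypothetical company proxy
    api_key="sk-proxy-issued-key",                # key issued by the proxy
)

resp = client.embeddings.create(
    model="text-embedding-3-small",
    input="text to embed for RAG",
)
print(len(resp.data[0].embedding))  # embedding dimensionality
```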
Motivation
The ability to use RAG features in environments where users are required to route requests through a proxy.
Other
No response
~~use env `OPENAI_BASE_URL`~~
use env `OPENAI_API_BASE`
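For reference, a minimal sketch of the env-based route; it assumes the service relies on the legacy `openai` Python SDK or LangChain, both of which read `OPENAI_API_BASE` (the `openai` >= 1.0 client reads `OPENAI_BASE_URL` instead). The URL and key are placeholders:

```python
# Sketch: set the variables on the host before starting the service, so any
# OpenAI client that honours OPENAI_API_BASE talks to the proxy instead.
import os

os.environ["OPENAI_API_BASE"] = "https://llm-proxy.example.com/v1"  # placeholder proxy URL
os.environ["OPENAI_API_KEY"] = "sk-proxy-issued-key"                # placeholder key
```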
#1400
Great, thank you.
It seems that `OPENAI_API_KEY` works, but `OPENAI_API_BASE` doesn't get passed through to the docker container.
The following are also not being passed to the docker container:
`RAG_LLM_MODEL`, `RAG_EMBED_MODEL`, `ANONYMIZED_TELEMETRY`
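A minimal sketch of the kind of forwarding being asked for, under the assumption (not confirmed in this thread) that the RAG service container is launched via the `docker` Python SDK; the image name is a placeholder:

```python
# Sketch: forward the relevant host environment variables into the RAG
# service container when it is started.
import os
import docker

FORWARDED_VARS = [
    "OPENAI_API_KEY",
    "OPENAI_API_BASE",
    "RAG_LLM_MODEL",
    "RAG_EMBED_MODEL",
    "ANONYMIZED_TELEMETRY",
]

# Only pass along variables that are actually set on the host.
env = {name: os.environ[name] for name in FORWARDED_VARS if name in os.environ}

client = docker.from_env()
client.containers.run(
    "example/rag-service:latest",  # placeholder image name
    environment=env,
    detach=True,
)
```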
Ah, I see that it's now part of the `rag_service` config. Is there a way to pass `ANONYMIZED_TELEMETRY`, though?