phospho
Select a different LLM provider via environment variables
You should be able to change the LLM provider without changing the code, just by specifying the model, base_url, and api_key. This should work with any server compatible with the OpenAI client (Mistral, Ollama, vLLM, Together AI...).
Right now, you need to change how the OpenAI client is initialized in the pipeline; this is not documented yet.
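The desired behavior could be sketched as follows: read model, base_url, and api_key from environment variables once, and pass them wherever the client is constructed. The variable names below are hypothetical, since phospho does not document them yet.

```python
import os

def llm_config_from_env() -> dict:
    """Resolve LLM provider settings from environment variables.

    Env var names (LLM_MODEL, LLM_BASE_URL, LLM_API_KEY) are
    hypothetical placeholders, not a documented phospho API.
    """
    return {
        "model": os.environ.get("LLM_MODEL", "gpt-4o-mini"),
        "base_url": os.environ.get("LLM_BASE_URL", "https://api.openai.com/v1"),
        "api_key": os.environ.get("LLM_API_KEY", ""),
    }

# Point the same code at a local Ollama server just by changing env vars:
os.environ["LLM_BASE_URL"] = "http://localhost:11434/v1"
os.environ["LLM_MODEL"] = "mistral"
config = llm_config_from_env()
```

The resulting dict would then be unpacked into the OpenAI client constructor (e.g. `OpenAI(base_url=config["base_url"], api_key=config["api_key"])`) at the single place in the pipeline where the client is initialized today.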