docs: Add complete setup guide for common LLM providers
We need to add a comprehensive setup guide that covers how to configure Strix with the most popular LLM providers.
The doc should include:
- OpenAI (GPT-5.1, codex, etc.)
- Anthropic (Claude Sonnet / Opus)
- Google (Gemini)
- Azure OpenAI
- DeepSeek / Groq
- OpenRouter
- Ollama
- LMStudio
- etc.
Each provider section should include:
- Required API keys / environment variables
- Example `litellm` config
- Recommended model(s) for best Strix performance
- Common pitfalls + troubleshooting
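To make the ask concrete, here is a rough sketch of the env-var snippet one such provider section could document. The `STRIX_LLM` name and the litellm-style `provider/model` string are assumptions based on a typical litellm-backed setup; `LLM_API_KEY` is the variable mentioned later in this thread. Verify against the actual Strix config before publishing.

```shell
# Hypothetical OpenAI section sketch (variable names are assumptions,
# confirm them against the Strix docs/source before documenting).

# litellm-style "provider/model" string selecting the model Strix should use:
export STRIX_LLM="openai/gpt-5.1"

# API key Strix reads (reported in this issue as mandatory):
export LLM_API_KEY="sk-..."
```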
WIP: https://github.com/TWN-Systems/strix-docs; will submit a PR later tonight.
I've been using AWS Bedrock as my cloud provider with Anthropic models, and I noticed it isn't on the list. There are some issues, but overall it still manages to work things out. However, I have to set `LLM_API_KEY` to an arbitrary value because Strix treats it as mandatory, even though it isn't actually needed in my case since I run Strix in an already-authenticated AWS environment.
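A minimal sketch of the workaround described above, assuming a litellm-style `bedrock/<model-id>` string; the model id placeholder and region are illustrative, not verified values:

```shell
# Bedrock authentication comes from the ambient AWS credentials
# (profile, role, or instance metadata), but Strix still requires
# LLM_API_KEY to be set, so give it a throwaway value.
export LLM_API_KEY="not-used"          # any non-empty string; ignored by Bedrock auth

# Assumed litellm-style model string; substitute your Anthropic model id:
export STRIX_LLM="bedrock/<model-id>"

export AWS_REGION="us-east-1"          # assumed; whichever region hosts your Bedrock models
```

If this pattern is confirmed, the docs PR could note explicitly that `LLM_API_KEY` is a required-but-unused placeholder for Bedrock users.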