
docs: Add complete setup guide for common LLM providers

Open · 0xallam opened this issue 1 month ago · 2 comments

We need to add a comprehensive setup guide that covers how to configure Strix with the most popular LLM providers.

The doc should include:

  • OpenAI (GPT-5.1, Codex, etc.)
  • Anthropic (Claude Sonnet / Opus)
  • Google (Gemini)
  • Azure OpenAI
  • DeepSeek / Groq
  • OpenRouter
  • Ollama
  • LM Studio
  • etc.

Each provider section should include:

  • Required API keys / environment variables
  • Example litellm config
  • Recommended model(s) for best Strix performance
  • Common pitfalls + troubleshooting
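To make the "required environment variables" part concrete, here is a minimal sketch of what one provider section could show. It assumes Strix reads the model via a `STRIX_LLM`-style variable (only `LLM_API_KEY` is confirmed in this thread) and that model names follow litellm's `provider/model` convention; treat the exact variable and model names as placeholders to verify against the docs.

```shell
# Hypothetical Anthropic setup sketch (variable names to be confirmed):
# - LLM_API_KEY is the key variable mentioned in this thread.
# - STRIX_LLM and the model identifier are assumptions following
#   litellm's "provider/model" naming convention.
export LLM_API_KEY="sk-ant-your-key-here"        # provider API key
export STRIX_LLM="anthropic/<model-name>"        # litellm-style model spec
```

Each provider section would swap in its own prefix (`openai/...`, `gemini/...`, `azure/...`) and any provider-specific variables.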

0xallam · Nov 27 '25 17:11

WIP: https://github.com/TWN-Systems/strix-docs. Will submit a PR later tonight.

yokoszn · Nov 28 '25 04:11

I've been using AWS Bedrock as my cloud provider with Anthropic models, and I noticed it isn't on the list. There are some issues, but overall it still seems to work things out. However, I have to set `LLM_API_KEY` to an arbitrary value because Strix treats it as mandatory, even though it isn't actually needed since I run Strix in an already-authenticated environment.
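A minimal sketch of the workaround described above, assuming the standard AWS SDK environment variables for Bedrock auth and a litellm-style `bedrock/` model prefix; only `LLM_API_KEY` being mandatory is confirmed in this thread, the rest are placeholders.

```shell
# Strix insists on LLM_API_KEY even though Bedrock auth comes from the
# AWS environment (IAM role, AWS_PROFILE, etc.), so give it a dummy value.
export LLM_API_KEY="placeholder"                  # required by Strix, unused by Bedrock
export AWS_REGION="us-east-1"                     # standard AWS SDK variable (assumed region)
export STRIX_LLM="bedrock/<anthropic-model-id>"   # assumed litellm Bedrock prefix
```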

amille44420 · Dec 02 '25 09:12