strix
How to integrate with a locally deployed model
Strix is deployed on Windows, and the large model is also deployed locally. How can the two be integrated?
A 0.6B model is far too small and simply won't work with Strix at this stage.
I'd highly recommend a frontier model such as Claude Sonnet/Opus 4.5 or GPT-5.
Or, if you want to try a free model, get an API key from https://openrouter.ai/ and set STRIX_LLM to "openrouter/grok-4.1-fast:free".
Further details on the model comparison: https://artificialanalysis.ai/models/comparisons/grok-4-1-fast-reasoning-vs-qwen3-0.6b-instruct-reasoning
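Putting the OpenRouter suggestion together, a minimal shell sketch. This assumes Strix reads the model name from the STRIX_LLM environment variable as described above; the API-key variable name (LLM_API_KEY) and the CLI flags shown are illustrative and may differ in your installation, so check your version's README.

```shell
# Point Strix at the free Grok model on OpenRouter
export STRIX_LLM="openrouter/grok-4.1-fast:free"

# API key from https://openrouter.ai/ (variable name is an assumption -
# verify against your Strix version's documentation)
export LLM_API_KEY="sk-or-..."

# Then run Strix as usual, e.g.:
# strix --target example.com
```

On Windows, use `set` (cmd) or `$env:STRIX_LLM = "..."` (PowerShell) instead of `export`.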