
How to integrate with a locally deployed model

Open · Rewinner99 opened this issue 1 month ago · 1 comment

Strix is deployed on Windows, and the large model is also deployed locally. How do I integrate the two?


Rewinner99 · Nov 27 '25

0.6B is far, far too small and simply won't work with Strix at this stage.

I'd highly recommend using a frontier model like Claude Sonnet/Opus 4.5 or ChatGPT 5.

Or, if you want to try a free model, get an API key from https://openrouter.ai/ and set STRIX_LLM to "openrouter/grok-4.1-fast:free".
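Something like this should be close on Windows (PowerShell). STRIX_LLM is the variable mentioned above; LLM_API_KEY is assumed here as the variable Strix reads the provider key from, so double-check the Strix docs before relying on it:

```powershell
# Sketch only: STRIX_LLM comes from the comment above; LLM_API_KEY is an assumed
# variable name for the OpenRouter API key -- verify it against the Strix documentation.
$env:STRIX_LLM   = "openrouter/grok-4.1-fast:free"    # free model via OpenRouter
$env:LLM_API_KEY = "<your-openrouter-api-key>"        # key from https://openrouter.ai/
# ...then launch Strix as usual from this same shell so it inherits the variables.
```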

Further details on model comparison: https://artificialanalysis.ai/models/comparisons/grok-4-1-fast-reasoning-vs-qwen3-0.6b-instruct-reasoning

yokoszn · Nov 28 '25