Strix does not perform automated scans with Ollama2 local model on Windows 11

Open razvynmk-ui opened this issue 1 month ago • 2 comments

Describe the bug
When trying to use Strix with Ollama2 as the local model on Windows 11, the agent starts and interacts automatically but does not perform the intended scan on the target. The AI seems to "talk to itself" and does not carry out the security audit or scanning tasks as expected.

To Reproduce
Steps to reproduce the behavior:

  1. Install Strix on Windows 11.

  2. Set up Ollama2 locally with the model available.

  3. Configure Strix environment variables:

    • STRIX_LLM=ollama/ollama2
    • LLM_API_BASE=http://localhost:11434
    • LLM_API_KEY=local
  4. Run the command (a consolidated Windows example follows after these steps):

    strix -n -t 192.168.1.141

    or with instructions:

    strix -n -t 192.168.1.141 --instruction "Perform authorized security audit only"
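
For reference, here are steps 3 and 4 consolidated into a single Windows cmd session. This is only a sketch of the setup already described above, assuming Strix reads these variables from the environment of the shell it is launched from (in PowerShell, use $env:STRIX_LLM = "ollama/ollama2" and so on instead of set):

    set STRIX_LLM=ollama/ollama2
    set LLM_API_BASE=http://localhost:11434
    set LLM_API_KEY=local
    strix -n -t 192.168.1.141 --instruction "Perform authorized security audit only"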

Expected behavior
Strix should use the Ollama2 model to perform a structured, automated scan of the target (192.168.1.141), detect vulnerabilities, and save the results in the agent_runs folder. The agent should not just output text without scanning.

System Information:

  • OS: Windows 11
  • Python Version: 3.12
  • LLM Used: Ollama2 local

Additional context

  • Ollama2 is running and reachable at http://localhost:11434 (a quick way to verify this is shown after this list).
  • Strix seems to recognize the model (AI outputs responses) but does not execute scanning tasks.
  • This may indicate that Strix is not recognizing the local Ollama2 model as capable of performing automated scanning.
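
As a quick sanity check that is independent of Strix, the following confirms both that the Ollama server is reachable and which model names it actually serves (this uses Ollama's standard HTTP API; curl ships with Windows 10/11):

    curl http://localhost:11434/api/tags
    ollama list

If ollama2 does not appear in that list, the name after ollama/ in STRIX_LLM likely needs to be changed to one that does.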

razvynmk-ui · Nov 06 '25 16:11

Hi @razvynmk-ui 👋

I noticed that Strix connects to the local Ollama2 model without any problem but still doesn't trigger the scanning executor. Ollama2 is clearly responding, so the API setup (LLM_API_BASE=http://localhost:11434 and LLM_API_KEY=local) is working fine. The issue seems to be that local Ollama models like Ollama2 don't return structured tool-call or JSON outputs; they only respond with plain text. Because of that, Strix receives the message but doesn't execute any scanning actions.
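
One way to test that hypothesis directly, outside of Strix, is to ask Ollama's chat endpoint for a tool call and check whether the reply contains a structured tool_calls field or only plain text. This is just a probe sketch: it assumes a reasonably recent Ollama build with tool-calling support, and the run_scan tool and the tools_probe.json filename are made up for the test:

    curl http://localhost:11434/api/chat -d @tools_probe.json

with tools_probe.json containing:

    {
      "model": "ollama2",
      "stream": false,
      "messages": [{"role": "user", "content": "Scan 192.168.1.141 for open ports."}],
      "tools": [{
        "type": "function",
        "function": {
          "name": "run_scan",
          "description": "Run a port scan against a target host",
          "parameters": {
            "type": "object",
            "properties": {"target": {"type": "string"}},
            "required": ["target"]
          }
        }
      }]
    }

If the response only has message.content text and no message.tool_calls, the model (or this Ollama version) is not emitting structured tool calls, which would match the behavior described above.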

To work around or test this, I tried a few things (the exact commands are consolidated after this list):

  • Forced function-call mode with set STRIX_FUNCTION_MODE=true
  • Switched to a model that supports structured responses, e.g. STRIX_LLM=ollama/codellama:13b-instruct
  • Ran Strix in debug mode for more detail: strix -n -t 192.168.1.141 --debug
  • Made sure scanner tools like nmap are on PATH, using where nmap
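
For anyone who wants to repeat those checks, this is roughly what they looked like in one cmd session. To be clear, STRIX_FUNCTION_MODE is only something I experimented with; I have not confirmed it is an actual Strix setting:

    set STRIX_FUNCTION_MODE=true
    set STRIX_LLM=ollama/codellama:13b-instruct
    strix -n -t 192.168.1.141 --debug
    where nmap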

From what I can tell, the root cause is that Strix isn't interpreting Ollama2's plain-text output as executable scan instructions. Adding a compatibility layer or a fallback parser for local LLMs could perhaps solve this in a future update. Overall, everything else works well; only the execution-trigger part seems to be missing.

shreeradhika623-sudo · Nov 08 '25 17:11