[BUG] litellm.BadRequestError - LLM API Rejects Messages Containing `<channel>` Tags
**Describe the bug**
When using Strix with certain LLMs (e.g., GPT-5, GPT-4.1, or a LiteLLM proxy), the model returns an error:
`Invalid request: You have passed a message containing <channel> tags`
Strix appears to send internal ChatGPT-style tags such as:
- `<analysis>`
- `<final>`
- `<assistant>`
- `<user>`
- `<channel>`
These tags are not valid in OpenAI-compatible APIs, causing a 400 Bad Request.
**To Reproduce**
- Create a Strix agent (e.g., PentestAgent or a custom agent).
- Execute a task that produces chain-of-thought-style output or internal tags.
- Strix sends the full output (including the `<analysis>`/`<channel>` tags) to the LLM backend.
- The LLM responds with the error `Invalid request: message contains <channel> tags` (a minimal repro sketch follows this list).
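A minimal sketch of the failure path, assuming an OpenAI-compatible endpoint served through LiteLLM and the model from System Information below (the exact error text varies by backend):

```python
# Repro sketch: send a message containing internal ChatGPT-style tags
# through LiteLLM to an OpenAI-compatible backend.
import litellm

messages = [
    {
        "role": "user",
        # Content with the internal tags Strix appears to emit.
        "content": "<analysis>scan target</analysis><channel>final</channel>Scan complete.",
    }
]

try:
    litellm.completion(model="openai/openai/gpt-oss-20b", messages=messages)
except litellm.BadRequestError as exc:
    # Backends that validate these tags reject the request, e.g.:
    # "Invalid request: You have passed a message containing <channel> tags"
    print(exc)
```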
**Expected behavior**
Strix should strip or sanitize internal tags such as `<analysis>` and `<channel>` from message content before sending it to the LLM backend, so that OpenAI-compatible APIs accept the request.
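One way this could be handled (a minimal sketch; `strip_internal_tags` and the tag list are hypothetical, not existing Strix code):

```python
# Workaround sketch: remove internal ChatGPT-style tags before the API call.
# The helper name and tag list are hypothetical, not part of Strix.
import re

INTERNAL_TAGS = ("analysis", "final", "assistant", "user", "channel")
_TAG_RE = re.compile(r"</?(?:{})>".format("|".join(INTERNAL_TAGS)), re.IGNORECASE)

def strip_internal_tags(text: str) -> str:
    """Drop <analysis>, <channel>, etc. so OpenAI-compatible APIs accept the message."""
    return _TAG_RE.sub("", text)

# strip_internal_tags("<analysis>probe</analysis>done")  ->  "probedone"
```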
**Screenshots**
Screenshot added for reference.
**System Information**
- OS: Kali Linux 2024.3
- Strix Version / Commit: 0.1.18
- Python Version: 3.12
- LLM Used: `openai/openai/gpt-oss-20b` via LiteLLM (OpenAI-compatible endpoint)
Any idea why this is happening here?