strix
Open-source AI agents for penetration testing
### Description: `litellm.BadRequestError: PerplexityException - After the (optional) system message(s), user or tool message(s) should alternate with assistant message(s).` My assumption is that this issue could be related to Strix...
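The Perplexity error above means that, after any leading system messages, the conversation must strictly alternate between user/tool turns and assistant turns. A common client-side workaround is to merge consecutive same-side messages before sending. This is an illustrative sketch only, not Strix's actual fix; the function name and message shape are assumptions:

```python
def enforce_alternation(messages):
    """Merge consecutive non-assistant messages so that, after any leading
    system message(s), user/tool and assistant turns alternate -- the shape
    Perplexity's API expects. Illustrative workaround, not Strix code."""
    fixed = []
    for msg in messages:
        if (fixed
                and msg["role"] != "assistant"
                and fixed[-1]["role"] not in ("assistant", "system")):
            # Two non-assistant turns in a row: fold into the previous one.
            fixed[-1]["content"] += "\n" + msg["content"]
        else:
            fixed.append(dict(msg))
    return fixed
```

Running the payload through a normalizer like this before the `litellm.completion` call avoids the `BadRequestError`, at the cost of collapsing distinct tool outputs into one message.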
**Is your feature request related to a problem? Please describe.** I would like to be able to provide much more detailed instructions than fit in a single command-line argument....
**Describe the bug** Although Strix itself requires Docker to run commands like `strix --target `, I cannot run it inside a `docker:dind` container. **To Reproduce** Steps to reproduce the behavior: 1....
**Describe the bug** When using Strix with certain LLMs (e.g., GPT-5, GPT-4.1, or LiteLLM proxy), the model returns an error: Invalid request: You have passed a message containing tags in...
**Describe the bug** If the API_KEY is wrong on the first run, the image is removed, so the image layers have to be pulled again before the next run. **To Reproduce** Steps to...
I ran `strix -t https://70.179.6.240/` with the following environment configuration: `declare LLM_API_BASE="http://70.189.82.120:52001/v1"` and `declare LLM_API_KEY="XXXXXX"`. However, the following error was reported. How exactly should a local large model be configured? Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new...
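For a local OpenAI-compatible server, the configuration is typically exported as environment variables before invoking the CLI. The sketch below is an assumption based on the variable names in the report above; check the Strix documentation for the exact variables (in particular the model name) your version expects:

```shell
# Variable names LLM_API_BASE / LLM_API_KEY come from the report above;
# whether a separate model-name variable is required is an assumption.
export LLM_API_BASE="http://70.189.82.120:52001/v1"  # local server's /v1 endpoint
export LLM_API_KEY="XXXXXX"                          # any non-empty key if the server ignores auth

strix -t https://70.179.6.240/
```

Note that LiteLLM-based tools usually also need the model name prefixed with a provider (e.g. `openai/<model>`) so the request is routed to the OpenAI-compatible code path.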
Can the default GPT-5 LLM configuration be modified manually? If so, how is it configured?
Work on #29: the ability to record checkpoints of Strix state and resume execution from the last checkpoint. The PR includes two implementations of the CheckpointStore interface: SQLite and a file store....