## Prerequisites
- Docker (running)
- An LLM provider API key (OpenAI, Anthropic, or local model)
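Because the sandbox runs in Docker, it can help to confirm the daemon is actually reachable before installing. A quick preflight sketch (not part of the installer; `docker info` is just a cheap way to ping the daemon):

```shell
# Preflight: `docker info` talks to the Docker daemon, so it fails fast
# when Docker is installed but not running.
if docker info >/dev/null 2>&1; then
  echo "docker: running"
else
  echo "docker: not running; start Docker Desktop or the dockerd service" >&2
fi
```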
## Installation

```bash
curl -sSL https://strix.ai/install | bash
```
## Configuration

Set your LLM provider and API key:

```bash
export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="your-api-key"
```
For best results, use `openai/gpt-5` or `anthropic/claude-sonnet-4-5`.
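A quick way to confirm both variables are exported (and so will be inherited by the `strix` process) before launching a scan. The variable names come from the section above; the key value is a placeholder:

```shell
export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="your-api-key"   # placeholder; substitute a real key

# printenv only sees exported variables, which is exactly what
# a child process such as strix will inherit.
for var in STRIX_LLM LLM_API_KEY; do
  if [ -n "$(printenv "$var")" ]; then
    echo "ok: $var"
  else
    echo "missing: $var" >&2
  fi
done
```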
## Run Your First Scan

```bash
strix --target ./your-app
```

The first run pulls the Docker sandbox image automatically. Results are saved to `strix_runs/<run-name>`.
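Once a scan completes, the results directory can be inspected from the shell. A minimal sketch for finding the newest run (the `example-run` name below is a stand-in so the snippet is self-contained; Strix generates its own run names):

```shell
# Each scan writes its artifacts under strix_runs/<run-name>.
# Create a stand-in run directory so the snippet runs on its own;
# a real scan creates this automatically.
mkdir -p strix_runs/example-run

# Sort by modification time to show the most recent run first.
ls -1t strix_runs | head -n 1
```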
## Target Types

Strix accepts multiple target types:

```bash
# Local codebase
strix --target ./app-directory

# GitHub repository
strix --target https://github.com/org/repo

# Live web application
strix --target https://your-app.com

# Multiple targets (white-box testing)
strix -t https://github.com/org/repo -t https://your-app.com
```
## Next Steps