Prerequisites

  • Docker (running)
  • An LLM API key from any supported provider (OpenAI, Anthropic, Google, etc.)
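
The prerequisites can be checked up front. Below is a preflight sketch in plain POSIX shell; it is not part of the Strix installer, just a convenience check that the Docker daemon is reachable and an API key is exported:

```shell
# Preflight sketch (illustrative, not part of the installer): verify the
# Docker daemon is reachable and an API key is exported before installing.
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  docker_status="ready"
else
  docker_status="unavailable"
fi
echo "Docker: $docker_status"

if [ -n "${LLM_API_KEY:-}" ]; then
  echo "LLM_API_KEY: set"
else
  echo "LLM_API_KEY: missing (export it before running a scan)"
fi
```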

Installation

curl -sSL https://strix.ai/install | bash

Configuration

Set your LLM provider and API key:

export STRIX_LLM="openai/gpt-5.4"
export LLM_API_KEY="your-api-key"

For best results, use openai/gpt-5.4, anthropic/claude-opus-4-6, or openai/gpt-5.2.

Run Your First Scan

strix --target ./your-app

The first run pulls the Docker sandbox image automatically. Results are saved to strix_runs/<run-name>.
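
Since each scan writes its results under strix_runs/<run-name>, you can find the most recent run with ordinary shell tools. A small sketch (plain POSIX shell, nothing Strix-specific assumed):

```shell
# List the most recently modified run directory, if any.
latest_run=$(ls -t strix_runs 2>/dev/null | head -n 1)
echo "Most recent run: ${latest_run:-no runs yet}"
```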

Target Types

Strix accepts multiple target types:

# Local codebase
strix --target ./app-directory

# GitHub repository
strix --target https://github.com/org/repo

# Live web application
strix --target https://your-app.com

# Multiple targets (white-box testing)
strix -t https://github.com/org/repo -t https://your-app.com
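
The multi-target form can also be assembled from a list, for example in a wrapper script. A sketch (the targets are the placeholders from the examples above; the final command is printed rather than executed):

```shell
# Build a multi-target strix invocation by appending one -t flag per target.
targets="https://github.com/org/repo https://your-app.com"
cmd="strix"
for t in $targets; do
  cmd="$cmd -t $t"
done
echo "$cmd"   # prints: strix -t https://github.com/org/repo -t https://your-app.com
```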

Next Steps

CLI Options

Explore all command-line options.

Scan Modes

Choose the right scan depth.