Strix Router gives you access to the best LLMs through a single API key.
Strix Router is currently in beta. It’s completely optional — Strix works with any LiteLLM-compatible provider using your own API keys, or with local models. Strix Router is just the setup we test and optimize for.
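As a sketch of the non-router path mentioned above: the same two environment variables can point Strix at your own provider using a LiteLLM-style `provider/model` ID. The variable names come from this page; the key and model ID below are illustrative placeholders, not verified values.

```shell
# Bring-your-own-provider sketch (placeholder values).
# LLM_API_KEY and STRIX_LLM are the variable names documented below;
# the anthropic/... model ID is a hypothetical LiteLLM-style example.
export LLM_API_KEY='your-provider-api-key'
export STRIX_LLM='anthropic/claude-sonnet-4.6'
```

With your own keys, rate limits and pricing are whatever your provider account gives you; the router-specific benefits listed below do not apply.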

Why Use Strix Router?

  • High rate limits — No throttling during long-running scans
  • Zero data retention — Requests are routed only to providers with zero-data-retention policies enabled
  • Failover & load balancing — Automatic fallback across providers for reliability
  • Simple setup — One API key, one environment variable, no provider accounts needed
  • No markup — Same token pricing as the underlying providers, no extra fees
  • $10 free credit — Try it free on signup, no credit card required

Quick Start

  1. Get your API key at models.strix.ai
  2. Set your environment:

```bash
export LLM_API_KEY='your-strix-api-key'
export STRIX_LLM='strix/gpt-5'
```

  3. Run a scan:

```bash
strix --target ./your-app
```

Available Models

Anthropic

| Model | ID |
| --- | --- |
| Claude Sonnet 4.6 | `strix/claude-sonnet-4.6` |
| Claude Opus 4.6 | `strix/claude-opus-4.6` |

OpenAI

| Model | ID |
| --- | --- |
| GPT-5.2 | `strix/gpt-5.2` |
| GPT-5.1 | `strix/gpt-5.1` |
| GPT-5 | `strix/gpt-5` |
| GPT-5.2 Codex | `strix/gpt-5.2-codex` |
| GPT-5.1 Codex Max | `strix/gpt-5.1-codex-max` |
| GPT-5.1 Codex | `strix/gpt-5.1-codex` |
| GPT-5 Codex | `strix/gpt-5-codex` |

Google

| Model | ID |
| --- | --- |
| Gemini 3 Pro | `strix/gemini-3-pro-preview` |
| Gemini 3 Flash | `strix/gemini-3-flash-preview` |

Other

| Model | ID |
| --- | --- |
| GLM-5 | `strix/glm-5` |
| GLM-4.7 | `strix/glm-4.7` |

Configuration Reference

LLM_API_KEY (string, required)
Your Strix API key from models.strix.ai.

STRIX_LLM (string, required)
Model ID from the tables above. Must be prefixed with strix/.
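Putting both required variables together, a minimal sketch that sets them and sanity-checks the required strix/ prefix before launching a scan. The variable names and prefix rule come from the reference above; the prefix check itself is just an illustrative convenience, not part of Strix.

```shell
# Set both required variables (values are placeholders).
export LLM_API_KEY='your-strix-api-key'
export STRIX_LLM='strix/gpt-5'

# Sanity-check the documented strix/ prefix before running a scan.
case "$STRIX_LLM" in
  strix/*) echo "STRIX_LLM ok: $STRIX_LLM" ;;
  *)       echo "STRIX_LLM must be prefixed with strix/" >&2; exit 1 ;;
esac
```

With both variables exported, `strix --target ./your-app` from the Quick Start will pick them up from the environment.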