Strix Router (Recommended)
The fastest way to get started. Strix Router gives you access to tested models with the highest rate limits and zero data retention.

Bring Your Own Key
You can also use any LiteLLM-compatible provider with your own API keys:

| Model | Provider | Configuration |
|---|---|---|
| GPT-5 | OpenAI | `openai/gpt-5` |
| Claude Sonnet 4.6 | Anthropic | `anthropic/claude-sonnet-4-6` |
| Gemini 3 Pro | Google Vertex | `vertex_ai/gemini-3-pro-preview` |
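As a sketch, a bring-your-own-key setup usually comes down to exporting your provider's API key and the LiteLLM model string. The variable names below are assumptions for illustration, not confirmed configuration keys; check the provider guide for your installation's exact names.

```shell
# Hypothetical variable names -- confirm against your provider guide.
export STRIX_LLM="anthropic/claude-sonnet-4-6"  # LiteLLM provider/model string
export LLM_API_KEY="sk-ant-..."                 # your own provider API key
```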
Local Models
Run models locally with Ollama, LM Studio, or any OpenAI-compatible server.

Provider Guides
Strix Router
Recommended model router with high rate limits.
OpenAI
GPT-5 and Codex models.
Anthropic
Claude Opus, Sonnet, and Haiku.
OpenRouter
Access 100+ models through a single API.
Google Vertex AI
Gemini 3 models via Google Cloud.
AWS Bedrock
Claude and Titan models via AWS.
Azure OpenAI
GPT-5 via Azure.
Local Models
Llama 4, Mistral, and self-hosted models.
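For local models, the usual pattern is to point the agent at a locally running OpenAI-compatible endpoint. The snippet below is a hedged sketch using Ollama's default port; the variable names and model tag are illustrative assumptions, so consult the Local Models guide for the exact configuration.

```shell
# Illustrative only -- variable names and model tag are assumptions.
export STRIX_LLM="ollama/llama4"              # LiteLLM's Ollama provider prefix
export LLM_API_BASE="http://localhost:11434"  # Ollama's default local endpoint
```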
Model Format
Use LiteLLM's `provider/model-name` format:
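The model strings from the table above follow this pattern:

```
openai/gpt-5
anthropic/claude-sonnet-4-6
vertex_ai/gemini-3-pro-preview
```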