Strix uses [LiteLLM](https://docs.litellm.ai/) for model compatibility, so any LiteLLM-supported provider can be configured.
## Recommended Models
| Model | Provider | Notes |
|---|---|---|
| `openai/gpt-5` | OpenAI | Best overall performance |
| `anthropic/claude-sonnet-4-5` | Anthropic | Excellent reasoning |
## Cloud Providers

### OpenAI

```bash
export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="sk-..."
```
### Anthropic

```bash
export STRIX_LLM="anthropic/claude-sonnet-4-5"
export LLM_API_KEY="sk-ant-..."
```
### Google Vertex AI

```bash
export STRIX_LLM="vertex_ai/gemini-pro"
# Uses application default credentials
```
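If application default credentials are not yet set up, the standard gcloud flow below configures them. The project and location variables are a sketch based on LiteLLM's Vertex AI conventions; variable names can differ between versions, so confirm them against your installed LiteLLM.

```bash
# One-time setup of application default credentials
gcloud auth application-default login

# LiteLLM also needs a GCP project and region for Vertex AI
# (variable names assumed from LiteLLM docs; verify for your version)
export VERTEXAI_PROJECT="my-gcp-project"   # hypothetical project ID
export VERTEXAI_LOCATION="us-central1"
```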
### AWS Bedrock

```bash
export STRIX_LLM="bedrock/anthropic.claude-3-sonnet-20240229-v1:0"
# Uses AWS credentials from environment
```
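A minimal sketch of the environment credentials the Bedrock integration reads. The access key variables are standard AWS; `AWS_REGION_NAME` is the region variable used in LiteLLM's docs, so verify it against your version.

```bash
# Standard AWS credentials, picked up from the environment
export AWS_ACCESS_KEY_ID="..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_REGION_NAME="us-east-1"  # region variable name per LiteLLM convention
```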
## Local Models

### Ollama

```bash
export STRIX_LLM="ollama/llama3"
export LLM_API_BASE="http://localhost:11434"
```
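The model must already be pulled and the Ollama server running. A quick sanity check, assuming a default Ollama install listening on port 11434:

```bash
# Pull the model locally (one-time)
ollama pull llama3

# Verify the server is up and the model appears in the list
curl http://localhost:11434/api/tags
```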
### LM Studio

```bash
export STRIX_LLM="openai/local-model"
export LLM_API_BASE="http://localhost:1234/v1"
```
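LM Studio's local server speaks the OpenAI API, which is why the `openai/` prefix is paired with a custom API base. To confirm the server is running and find the exact identifier to substitute for `local-model`:

```bash
# Lists the models LM Studio is currently serving
curl http://localhost:1234/v1/models
```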
Local models may have reduced performance compared to GPT-5 or Claude. Test thoroughly before production use.
## Model Name Format

Use the LiteLLM format: `provider/model-name`

```
openai/gpt-5
anthropic/claude-sonnet-4-5
vertex_ai/gemini-pro
ollama/llama3
```
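The prefix before the slash selects LiteLLM's routing backend. LiteLLM can infer the provider for some well-known model names, but the explicit form is unambiguous:

```bash
# Explicit provider prefix: unambiguous routing
export STRIX_LLM="anthropic/claude-sonnet-4-5"

# Bare name: relies on LiteLLM inferring the provider, which
# works for some well-known models but is not guaranteed
export STRIX_LLM="claude-sonnet-4-5"
```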