Strix uses LiteLLM for model compatibility.
Model                         Provider    Notes
openai/gpt-5                  OpenAI      Best overall performance
anthropic/claude-sonnet-4-5   Anthropic   Excellent reasoning

Cloud Providers

OpenAI

export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="sk-..."

Anthropic

export STRIX_LLM="anthropic/claude-sonnet-4-5"
export LLM_API_KEY="sk-ant-..."

Google Vertex AI

export STRIX_LLM="vertex_ai/gemini-pro"
# Uses application default credentials
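Application default credentials come from gcloud or from a service account key. A minimal setup sketch; the VERTEXAI_* variable names follow LiteLLM's Vertex AI docs, so verify them against your LiteLLM version:

# One-time local authentication
gcloud auth application-default login

# Or point at a service account key instead
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/key.json"

# Project and region for LiteLLM (names per LiteLLM's Vertex AI docs)
export VERTEXAI_PROJECT="my-gcp-project"
export VERTEXAI_LOCATION="us-central1"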

AWS Bedrock

export STRIX_LLM="bedrock/anthropic.claude-3-sonnet-20240229-v1:0"
# Uses AWS credentials from environment
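If credentials are not already configured via the AWS CLI or an instance role, export them explicitly; AWS_REGION_NAME is the region variable LiteLLM documents for Bedrock:

export AWS_ACCESS_KEY_ID="..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_REGION_NAME="us-east-1"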

Local Models

Ollama

export STRIX_LLM="ollama/llama3"
export LLM_API_BASE="http://localhost:11434"

LM Studio

export STRIX_LLM="openai/local-model"
export LLM_API_BASE="http://localhost:1234/v1"
Local models may not match the performance of GPT-5 or Claude. Test thoroughly before production use.

Model Format

Use the LiteLLM format: provider/model-name

Examples:

openai/gpt-5
anthropic/claude-sonnet-4-5
vertex_ai/gemini-pro
ollama/llama3
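
Because Strix passes this string straight to LiteLLM, one way to validate a provider/model pair before wiring it into Strix is LiteLLM's own CLI proxy (assumes pip install 'litellm[proxy]'; check LiteLLM's docs for the current flags):

# Start a local proxy for the model string
litellm --model openai/gpt-5

# In a second terminal, send a test request through the proxy
litellm --test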