# Anthropic Integration
Anthropic builds state-of-the-art language models with a strong focus on safety. GateFlow integrates seamlessly with the Anthropic API to add optimized routing, cost control, sustainability tracking, and observability.
## Current Models

### Chat Models
| Model | Context | Output | Input $/1M | Output $/1M |
|---|---|---|---|---|
| claude-opus-4-5-20251107 | 200K | 8,192 | $15.00 | $75.00 |
| claude-opus-4-1-20250805 | 200K | 8,192 | $12.00 | $60.00 |
| claude-sonnet-4-5-20250929 | 200K | 8,192 | $3.00 | $15.00 |
| claude-sonnet-4-20250514 | 200K | 8,192 | $2.50 | $12.50 |
| claude-haiku-4-5-20251015 | 200K | 8,192 | $0.25 | $1.25 |
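The per-1M-token prices in the table translate directly into per-request cost estimates. A minimal sketch (prices hard-coded from the table above; always verify current pricing in your dashboard):

```python
# Per-1M-token prices (input, output) in USD, taken from the table above.
PRICES = {
    "claude-opus-4-5-20251107": (15.00, 75.00),
    "claude-sonnet-4-5-20250929": (3.00, 15.00),
    "claude-haiku-4-5-20251015": (0.25, 1.25),
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of a single request from token counts."""
    in_price, out_price = PRICES[model]
    return input_tokens / 1_000_000 * in_price + output_tokens / 1_000_000 * out_price

# Example: 10k input / 1k output tokens on Sonnet
cost = estimate_cost("claude-sonnet-4-5-20250929", 10_000, 1_000)  # → 0.045 USD
```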
## Setup
- Navigate to Dashboard: Go to Settings → Providers
- Add Provider: Click "Add Provider" and select Anthropic
- Enter API Key: Paste your Anthropic API key
- Configure Settings:
  - Region: Select preferred data center (e.g., us-east-1)
  - Sustainability Mode: Enable to prioritize low-carbon regions
  - Default Model: Set default model for chat completions
- Save: The provider is now available for routing
## Configuration Options
```json
{
  "provider": "anthropic",
  "api_key": "your-anthropic-api-key",
  "region": "us-east-1",
  "sustainability_mode": true,
  "default_model": "claude-sonnet-4-5-20250929",
  "quality_threshold": 0.95,
  "carbon_budget": 50
}
```

## Best Practices
- Cost Optimization: Use `claude-haiku-4-5-20251015` for high-volume, low-latency tasks.
- Quality Tasks: Use `claude-opus-4-5-20251107` for complex reasoning and analysis.
- Sustainability: Enable sustainability mode to route requests to low-carbon data centers.
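The first two points amount to right-sizing the model per request. The helper below is an illustration only, using prompt length as a crude proxy for complexity; it is not GateFlow's routing logic:

```python
def pick_model(prompt: str, needs_reasoning: bool = False) -> str:
    """Crude right-sizing heuristic: Haiku for short, simple prompts,
    Opus when deep reasoning is explicitly requested, Sonnet otherwise."""
    if needs_reasoning:
        return "claude-opus-4-5-20251107"
    if len(prompt) < 500:
        return "claude-haiku-4-5-20251015"
    return "claude-sonnet-4-5-20250929"
```

In practice you would pass the returned name as the `model` argument of your completion call.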
## Example Usage
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.gateflow.ai/v1",
    api_key="gw_prod_your_key_here"
)

# Use Anthropic for complex reasoning
response = client.chat.completions.create(
    model="claude-opus-4-5-20251107",
    messages=[{"role": "user", "content": "Solve this complex problem"}],
    routing_mode="sustain_optimized"
)

print(response.choices[0].message.content)
print(f"Model used: {response.model}")
print(f"Carbon footprint: {response.sustainability.carbon_gco2e} gCO₂e")
print(f"Carbon saved: {response.sustainability.carbon_saved_gco2e} gCO₂e")
```

## Anthropic-Specific Features
### Tool Use
Anthropic models support function calling with GateFlow's unified interface:
```python
# Define tools (works across all providers)
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get weather information for a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
                }
            }
        }
    }
]

response = client.chat.completions.create(
    model="claude-opus-4-5-20251107",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
    routing_mode="sustain_optimized"
)
```

### Vision Support
```python
# Multi-modal input with vision
response = client.chat.completions.create(
    model="claude-opus-4-5-20251107",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Analyze this image"},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/image.jpg"}
                }
            ]
        }
    ],
    routing_mode="sustain_optimized"
)
```

## Sustainability Features
Anthropic models through GateFlow offer:
- Carbon-Optimized Routing: Automatically select the most energy-efficient data center
- Model Efficiency: Claude models are optimized for low energy consumption
- Time-Shifted Execution: Defer non-urgent requests to low-carbon periods
- Request Batching: Combine multiple requests for reduced overhead
- Automatic Model Selection: Choose the most efficient Claude model for your task
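Time-shifted execution can also be sketched client-side: given a grid carbon-intensity forecast, schedule a non-urgent job for the greenest hour. The forecast values below are invented for illustration; a real integration would pull them from a carbon-intensity API:

```python
def greenest_hour(forecast: dict[int, float]) -> int:
    """Return the hour (0-23) with the lowest forecast grid
    carbon intensity (gCO2e/kWh)."""
    return min(forecast, key=forecast.get)

# Hypothetical hourly forecast: intensity dips overnight
forecast = {9: 420.0, 13: 380.0, 22: 210.0, 3: 190.0}
run_at = greenest_hour(forecast)  # → 3
```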
## Model Selection Guide
| Use Case | Recommended Model | Key Features | Sustainability Benefits |
|---|---|---|---|
| Complex reasoning | claude-opus-4-5-20251107 | 200K context, vision | Highest efficiency per token |
| Production use | claude-sonnet-4-5-20250929 | Balanced performance | Best quality-to-carbon ratio |
| Fast responses | claude-haiku-4-5-20251015 | 250ms latency | Lowest carbon footprint |
| Cost-effective | claude-sonnet-4-20250514 | Balanced quality | Optimized for efficiency |
## Sustainability Best Practices

### Optimization Strategies
- Right-size your model: Use `claude-haiku-4-5-20251015` for simple tasks instead of Opus models
- Enable Sustain Mode: Let GateFlow automatically choose the most efficient Anthropic model
- Use time-shifting: Defer non-urgent requests to low-carbon periods
- Batch requests: Process multiple items in single API calls to reduce overhead
- Combine with caching: Cache frequent Anthropic requests for maximum savings
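The caching point can be illustrated with a minimal in-memory cache keyed on the exact prompt text. This is a client-side sketch, not GateFlow's semantic cache, which matches on meaning rather than exact wording:

```python
import hashlib

_cache: dict[str, str] = {}

def cached_completion(prompt: str, call_api) -> str:
    """Return a cached answer for an identical prompt, invoking the
    API (and paying its cost and carbon) only on a cache miss."""
    key = hashlib.sha256(prompt.encode()).hexdigest()
    if key not in _cache:
        _cache[key] = call_api(prompt)
    return _cache[key]
```

Here `call_api` stands in for any completion call; repeated identical prompts hit the dictionary instead of the network.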
### Configuration Example
```python
# Configure Anthropic provider with sustainability settings
response = client.chat.completions.create(
    model="anthropic:auto",  # Let GateFlow choose the most efficient Anthropic model
    messages=[{"role": "user", "content": "Process this sustainably"}],
    routing_mode="sustain_optimized",
    minimum_quality_score=8,  # Balance quality and efficiency
    region_preference="us-west"  # Prioritize low-carbon regions
)
```

## Performance Characteristics
### Latency Comparison

- Fastest: `claude-haiku-4-5-20251015` (250 ms)
- Balanced: `claude-sonnet-4-5-20250929` (1,400 ms)
- Advanced: `claude-opus-4-5-20251107` (2,000 ms)
### Token Limits
- All models: 200K context window
- Output limits: 8,192 tokens
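Staying inside the 200K context window usually means trimming conversation history before sending. The 4-characters-per-token ratio below is a rough rule of thumb for illustration, not Anthropic's actual tokenizer:

```python
def trim_history(messages: list[dict], max_tokens: int = 200_000) -> list[dict]:
    """Drop the oldest messages until a rough token estimate
    (~4 characters per token) fits within the context window."""
    est = lambda m: len(m["content"]) // 4 + 1
    while len(messages) > 1 and sum(est(m) for m in messages) > max_tokens:
        messages = messages[1:]  # drop oldest first
    return messages
```

For exact budgeting you would count tokens with the provider's tokenizer rather than a character heuristic.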
### Pricing Overview
- Input prices: $0.25-$15.00 per 1M tokens
- Output prices: $1.25-$75.00 per 1M tokens
## Integration with Other GateFlow Features

### Multi-Provider Fallbacks
```python
# Configure Anthropic as primary with fallbacks
response = client.chat.completions.create(
    model="claude-opus-4-5-20251107",  # Primary: Anthropic
    messages=[{"role": "user", "content": "Important request"}],
    fallback_providers=["openai", "mistral"],  # Fallback chain
    routing_mode="sustain_optimized"
)
```

### Semantic Caching
```python
# Cache frequent Anthropic requests
response = client.chat.completions.create(
    model="claude-sonnet-4-5-20250929",
    messages=[{"role": "user", "content": "Frequently asked question"}],
    cache_ttl_seconds=3600,  # Cache for 1 hour
    embedding_model="text-embedding-3-small"  # Used for semantic matching
)
```

## Troubleshooting
### "Anthropic API key not configured"
Solution: Add your Anthropic API key in the GateFlow Dashboard under Settings → Providers.
### "Model not found: claude-3-opus-20240229"
Solution: Use current models like claude-opus-4-5-20251107 instead of deprecated models.
### "Rate limit exceeded"
Solution:
- Check your Anthropic account limits
- Configure fallbacks to other providers
- Enable request queuing in GateFlow settings
- Use `claude-haiku-4-5-20251015` for high-volume applications
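When a rate limit does hit, client-side exponential backoff is a common complement to the steps above. A minimal sketch; the exception type to catch depends on your client library (a generic `RuntimeError` stands in here):

```python
import random
import time

def with_backoff(call, max_retries: int = 5, base_delay: float = 1.0):
    """Retry `call` with exponential backoff plus jitter on rate-limit errors."""
    for attempt in range(max_retries):
        try:
            return call()
        except RuntimeError:  # stand-in for your client's rate-limit error
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            time.sleep(base_delay * 2 ** attempt * (1 + random.random()))
```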
### "Carbon savings lower than expected"
Solution:
- Verify Sustain Mode is properly configured
- Check grid carbon intensity in your region
- Try different Anthropic models for better efficiency
- Enable time-shifted execution for non-urgent requests
## Migration from Direct Anthropic API

### Key Differences
| Feature | Direct Anthropic API | GateFlow Anthropic Integration |
|---|---|---|
| API Format | Anthropic-specific | OpenAI-compatible |
| Authentication | Anthropic API key | GateFlow API key |
| Model Names | claude-3-opus | claude-opus-4-5-20251107 |
| Carbon Tracking | Manual | Automatic |
| Multi-provider | No | Yes |
| Fallbacks | Manual | Automatic |
| Sustainability | Basic | Advanced optimization |
### Migration Example

Before (Direct Anthropic API):
```python
import anthropic

client = anthropic.Anthropic(api_key="your-anthropic-api-key")

response = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1024,  # required by the Anthropic Messages API
    messages=[{"role": "user", "content": "Hello from Anthropic!"}]
)
```

After (GateFlow Integration):
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.gateflow.ai/v1",
    api_key="gw_prod_your_gateflow_key"
)

response = client.chat.completions.create(
    model="claude-opus-4-5-20251107",  # Use current models
    messages=[{"role": "user", "content": "Hello from Anthropic via GateFlow with sustainability benefits!"}],
    routing_mode="sustain_optimized"  # Enable carbon optimization
)
```

## Next Steps
- Explore OpenAI Integration - Versatile AI models
- Try Google Gemini Models - Large context capabilities
- Configure Sustain Mode - Automatic carbon optimization
- View Provider Analytics - Monitor your Anthropic carbon savings