@@ -15,6 +15,8 @@ claude plugin install adversarial-spec
 
 # 2. Set at least one API key
 export OPENAI_API_KEY="sk-..."
+# Or use OpenRouter for access to multiple providers with one key
+export OPENROUTER_API_KEY="sk-or-..."
 
 # 3. Run it
 /adversarial-spec "Build a rate limiter service with Redis backend"
@@ -57,16 +59,17 @@ You describe product --> Claude drafts spec --> Multiple LLMs critique in parall
 
 ## Supported Models
 
-| Provider  | Env Var             | Example Models |
-|-----------|---------------------|----------------|
-| OpenAI    | `OPENAI_API_KEY`    | `gpt-4o`, `gpt-4-turbo`, `o1` |
-| Anthropic | `ANTHROPIC_API_KEY` | `claude-sonnet-4-20250514`, `claude-opus-4-20250514` |
-| Google    | `GEMINI_API_KEY`    | `gemini/gemini-2.0-flash`, `gemini/gemini-pro` |
-| xAI       | `XAI_API_KEY`       | `xai/grok-3`, `xai/grok-beta` |
-| Mistral   | `MISTRAL_API_KEY`   | `mistral/mistral-large`, `mistral/codestral` |
-| Groq      | `GROQ_API_KEY`      | `groq/llama-3.3-70b-versatile` |
-| Deepseek  | `DEEPSEEK_API_KEY`  | `deepseek/deepseek-chat` |
-| Zhipu     | `ZHIPUAI_API_KEY`   | `zhipu/glm-4`, `zhipu/glm-4-plus` |
+| Provider   | Env Var              | Example Models |
+|------------|----------------------|----------------|
+| OpenAI     | `OPENAI_API_KEY`     | `gpt-4o`, `gpt-4-turbo`, `o1` |
+| Anthropic  | `ANTHROPIC_API_KEY`  | `claude-sonnet-4-20250514`, `claude-opus-4-20250514` |
+| Google     | `GEMINI_API_KEY`     | `gemini/gemini-2.0-flash`, `gemini/gemini-pro` |
+| xAI        | `XAI_API_KEY`        | `xai/grok-3`, `xai/grok-beta` |
+| Mistral    | `MISTRAL_API_KEY`    | `mistral/mistral-large`, `mistral/codestral` |
+| Groq       | `GROQ_API_KEY`       | `groq/llama-3.3-70b-versatile` |
+| OpenRouter | `OPENROUTER_API_KEY` | `openrouter/openai/gpt-4o`, `openrouter/anthropic/claude-3.5-sonnet` |
+| Deepseek   | `DEEPSEEK_API_KEY`   | `deepseek/deepseek-chat` |
+| Zhipu      | `ZHIPUAI_API_KEY`    | `zhipu/glm-4`, `zhipu/glm-4-plus` |
 
 Check which keys are configured:
 
@@ -97,6 +100,33 @@ When Bedrock is enabled, **all model calls route through Bedrock** - no direct A
 
 Configuration is stored at `~/.claude/adversarial-spec/config.json`.
 
+## OpenRouter Support
+
+[OpenRouter](https://openrouter.ai) provides unified access to multiple LLM providers through a single API. This is useful for:
+- Accessing models from multiple providers with one API key
+- Comparing models across different providers
+- Automatic fallback and load balancing
+- Cost optimization across providers
+
+**Setup:**
+
+```bash
+# Get your API key from https://openrouter.ai/keys
+export OPENROUTER_API_KEY="sk-or-..."
+
+# Use OpenRouter models (prefix with openrouter/)
+python3 debate.py critique --models openrouter/openai/gpt-4o,openrouter/anthropic/claude-3.5-sonnet < spec.md
+```
+
+**Popular OpenRouter models:**
+- `openrouter/openai/gpt-4o` - GPT-4o via OpenRouter
+- `openrouter/anthropic/claude-3.5-sonnet` - Claude 3.5 Sonnet
+- `openrouter/google/gemini-2.0-flash` - Gemini 2.0 Flash
+- `openrouter/meta-llama/llama-3.3-70b-instruct` - Llama 3.3 70B
+- `openrouter/qwen/qwen-2.5-72b-instruct` - Qwen 2.5 72B
+
+See the full model list at [openrouter.ai/models](https://openrouter.ai/models).
+
 ## OpenAI-Compatible Endpoints
 
 For models that expose an OpenAI-compatible API (local LLMs, self-hosted models, alternative providers), set `OPENAI_API_BASE`: