Bridge any AI model to Claude Code CLI — route through OpenRouter, DeepSeek, OpenAI, Gemini, or any Anthropic-compatible endpoint.
Available in: English | Українська | Русский
Claude Code CLI is hardcoded for Anthropic's API. It sends requests in Anthropic's Messages API format, expects responses in that exact format, and authenticates using Anthropic's method. If you want to use any other model — DeepSeek, GPT-4, Gemini, GLM, Llama, Qwen — you hit three walls:
Every AI provider has its own API format. DeepSeek, OpenAI, and Gemini use the OpenAI-compatible format. Chinese models (GLM, Qwen) have their own APIs. Open-source models (Llama, Mistral) each have unique response structures.
Claude Code expects Anthropic's specific streaming events (content_block_delta), content block types, tool_use structures, stop_reason values, and token counting fields. Send it a raw DeepSeek or OpenAI response — it crashes.
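As a rough sketch (shapes simplified from each API's streaming format), compare the delta event Claude Code expects with a raw OpenAI-style chunk:

```javascript
// Simplified shapes only — real streams carry more fields and SSE framing.

// Anthropic-style event Claude Code knows how to parse:
const anthropicEvent = {
  type: "content_block_delta",
  index: 0,
  delta: { type: "text_delta", text: "Hello" },
};

// OpenAI-style chunk — none of the fields Claude Code looks for:
const openaiChunk = {
  choices: [{ delta: { content: "Hello" } }],
};

console.log(anthropicEvent.type); // "content_block_delta"
```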
The fix: Providers like OpenRouter and DeepSeek offer Anthropic-compatible proxy endpoints — they accept requests in Anthropic format, route them to any model, and translate the response back into Anthropic format. claude-flow knows exactly which endpoints to use for each provider.
Claude Code's auth has an undocumented quirk: ANTHROPIC_API_KEY must be set to an empty string ("") for proxy providers — not absent, not unset. Specifically an empty string.
- If ANTHROPIC_API_KEY is absent → Claude Code errors out on startup
- If ANTHROPIC_API_KEY is non-empty → Claude Code uses it and silently ignores your proxy token
- If ANTHROPIC_API_KEY is "" → Claude Code falls back to ANTHROPIC_AUTH_TOKEN ✓
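For illustration (credential values are placeholders; header names follow each style's convention), the difference between the two auth methods is simply which header carries the credential:

```javascript
// Placeholder keys — illustrative only.

// Bearer-token style (OpenRouter):
const bearerHeaders = {
  Authorization: "Bearer sk-or-v1-example",
};

// API-key-header style (Anthropic-compatible endpoints such as DeepSeek):
const apiKeyHeaders = {
  "x-api-key": "sk-example",
  "anthropic-version": "2023-06-01",
};

console.log(Object.keys(bearerHeaders), Object.keys(apiKeyHeaders));
```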
Different providers also use different auth methods — Bearer token (OpenRouter) vs. API key header (DeepSeek, OpenAI). Getting any of this wrong means silent failures.
6 environment variables, zero official documentation, different combinations for each provider, and non-obvious behaviors. One wrong setting = Claude Code either crashes or silently talks to the wrong endpoint.
claude-flow solves all three. One command configures everything correctly for any provider.
# Install from GitHub
npm install -g github:Lexus2016/claude-flow
# Setup your provider (interactive)
claude-flow setup openrouter
# Run Claude Code
eval $(claude-flow env)
claude -p "Hello from OpenRouter!"

That's it. Now Claude Code uses your provider for every invocation.
Important: The claude-flow name on npm belongs to an unrelated project. Always install from GitHub to get the correct package.
npm install -g github:Lexus2016/claude-flow

Or install from a local clone:

git clone https://github.com/Lexus2016/claude-flow.git
cd claude-flow
npm link
claude-flow setup openrouter

Requirements:
- Node.js >= 18.0
- Claude Code CLI (from Anthropic) — install here
- An API key for your chosen provider (see Supported Providers)
npm install -g github:Lexus2016/claude-flow
claude-flow setup openrouter

You'll be prompted for:
- Your API key for the provider (e.g., sk-or-v1-...)
- Model preferences for each tier (haiku, sonnet, opus) — or press Enter for defaults
Example session:
claude-flow — Setup OpenRouter
Get your API key: https://openrouter.ai/keys
API Key: sk-or-v1-abc123def456...
Model configuration (press Enter for defaults)
Claude Code uses 3 model tiers: haiku (fast), sonnet (balanced), opus (powerful)
Popular models for OpenRouter:
anthropic/claude-sonnet-4-6 sonnet
anthropic/claude-opus-4-6 opus
anthropic/claude-haiku-4-5-20251001 haiku
google/gemini-2.5-pro opus
deepseek/deepseek-chat sonnet
Haiku model [anthropic/claude-haiku-4-5-20251001]:
Sonnet model [anthropic/claude-sonnet-4-6]:
Opus model [anthropic/claude-opus-4-6]:
✓ Configuration saved to /Users/you/.claude-flow/config.json
✓ Active provider: OpenRouter
Environment variables (what Claude Code will use):
ANTHROPIC_BASE_URL https://openrouter.ai/api
ANTHROPIC_AUTH_TOKEN sk-or-v1-abc...
ANTHROPIC_API_KEY "" ← intentionally empty
ANTHROPIC_DEFAULT_HAIKU_MODEL anthropic/claude-haiku-4-5-20251001
ANTHROPIC_DEFAULT_SONNET_MODEL anthropic/claude-sonnet-4-6
ANTHROPIC_DEFAULT_OPUS_MODEL anthropic/claude-opus-4-6
Quick start:
# Option 1: Run directly
claude-flow run -- claude -p 'Hello from OpenRouter!'
# Option 2: Add to your shell profile
echo 'eval "$(claude-flow env)"' >> ~/.bashrc
# Option 3: One-time eval
eval $(claude-flow env)
Choose one of three methods:
# Method 1: Run via the wrapper
claude-flow run -- claude -p "What is the weather?"

# Method 2: One-time eval in the current shell
eval $(claude-flow env)
claude -p "Build me a React component"

# Method 3: Add to ~/.bashrc, ~/.zshrc, or ~/.fish/config.fish
echo 'eval "$(claude-flow env)"' >> ~/.bashrc
# Then reload your shell
source ~/.bashrc
# Now Claude Code works everywhere
claude -p "Hello!"

| Provider | Model Selection | Cost | Setup | Popular For |
|---|---|---|---|---|
| OpenRouter | 200+ models from all providers | Varies | Easy | One API for everything |
| DeepSeek | V3.2, R1 (reasoning) | Cheapest | Easy | Cost-effective reasoning |
| OpenAI | GPT-5.2, o3 | Premium | Easy | Cutting-edge models |
| Gemini | Gemini 3.1 Pro / 3 Flash | Competitive | Easy | Fast inference |
| Custom | Any Anthropic-compatible endpoint | Varies | Manual | Self-hosted, private clouds |
Access 200+ models from a single API endpoint.
Perfect if you want flexibility — use Claude, GPT-4, DeepSeek, Gemini, Llama, Qwen, and more with a single key. Pay per use.
Get API key: https://openrouter.ai/keys
Setup:
claude-flow setup openrouter

Default models:
- Haiku: anthropic/claude-haiku-4-5-20251001
- Sonnet: anthropic/claude-sonnet-4-6
- Opus: anthropic/claude-opus-4-6
Popular models to try:
claude-flow models openrouter

Default tier mapping:
haiku  → anthropic/claude-haiku-4-5-20251001
sonnet → anthropic/claude-sonnet-4-6
opus   → anthropic/claude-opus-4-6
Popular models:
Model ID Name Tier
─────────────────────────────────────────────────────────────────────
anthropic/claude-opus-4-6              Claude Opus 4.6      opus
anthropic/claude-sonnet-4-6            Claude Sonnet 4.6    sonnet
anthropic/claude-haiku-4-5-20251001    Claude Haiku 4.5     haiku
google/gemini-3.1-pro-preview Gemini 3.1 Pro opus
google/gemini-3-flash-preview Gemini 3 Flash sonnet
openai/gpt-5.2 GPT-5.2 opus
openai/gpt-5.2-mini GPT-5.2 Mini sonnet
deepseek/deepseek-v3.2 DeepSeek V3.2 sonnet
deepseek/deepseek-r1-0528 DeepSeek R1 opus
minimax/minimax-m2.5 MiniMax M2.5 opus
moonshotai/kimi-k2.5 Kimi K2.5 opus
z-ai/glm-5 GLM-5 opus
qwen/qwen3.5-plus Qwen 3.5 sonnet
meta-llama/llama-4-maverick Llama 4 Maverick sonnet
z-ai/glm-4.5-air GLM-4.5 Air haiku
Use different model per tier:
claude-flow setup openrouter
# When prompted:
# Haiku model: google/gemini-2.5-flash
# Sonnet model: deepseek/deepseek-v3.2
# Opus model: openai/gpt-5.2

Ultra-fast and cheap. Great for reasoning tasks.
DeepSeek V3.2 (fast, hybrid thinking) and R1 (deep reasoning) through their native Anthropic-compatible API. Single digit cent costs.
Get API key: https://platform.deepseek.com/api_keys
Setup:
claude-flow setup deepseek

Default models:
- Haiku: deepseek-chat
- Sonnet: deepseek-v3.2
- Opus: deepseek-r1-0528
Why use DeepSeek?
- Cheapest LLM pricing available
- DeepSeek V3.2 supports hybrid thinking mode
- DeepSeek R1 reasoning beats most models on benchmarks
- Native Anthropic-compatible API (no proxy markup)
Use R1 for deep reasoning:
claude-flow setup deepseek
# When prompted for Opus, use: deepseek-r1-0528

Latest GPT models, including GPT-5.2 and o3.
GPT-5.2 (flagship), GPT-5.2 Thinking (reasoning) and o3 through OpenAI's official API.
Get API key: https://platform.openai.com/api-keys
Setup:
claude-flow setup openai

Default models:
- Haiku: gpt-5.2-mini
- Sonnet: gpt-5.2
- Opus: gpt-5.2
Popular models:
claude-flow models openai

Default tier mapping:
haiku → gpt-5.2-mini
sonnet → gpt-5.2
opus → gpt-5.2
Popular models:
Model ID Name Tier
─────────────────────────────────────────────────────────────────────
gpt-5.2 GPT-5.2 opus
gpt-5.2-mini GPT-5.2 Mini sonnet
gpt-5.2-thinking GPT-5.2 Thinking opus
o3 o3 opus
gpt-4.1 GPT-4.1 sonnet
Google's latest models. Fast and capable.
Gemini 3.1 Pro (latest flagship), Gemini 3 Flash, and Gemini 2.5 through Google's API.
Get API key: https://aistudio.google.com/apikey
Setup:
claude-flow setup gemini

Default models:
- Haiku: gemini-2.5-flash
- Sonnet: gemini-3-flash-preview
- Opus: gemini-3.1-pro-preview
Any Anthropic-compatible endpoint (self-hosted, private clouds, etc.).
If you have your own Claude-compatible endpoint, or want to use a provider not listed above:
claude-flow setup custom

You'll be prompted for:
- API base URL (e.g., https://your-proxy.local/v1)
- Model IDs
- API key
claude-flow help

Shows all commands and examples.
claude-flow setup <provider>

Interactive setup wizard. Guides you through getting an API key and choosing models.
Providers: openrouter, deepseek, openai, gemini, custom
Examples:
claude-flow setup openrouter
claude-flow setup deepseek
claude-flow setup openai

Print shell export statements. Used for eval or adding to your shell profile.
# Print to stdout (copy-paste friendly)
claude-flow env
# Or use in a command
eval $(claude-flow env)
# Or for a specific provider
claude-flow env deepseek
eval $(claude-flow env deepseek)

Output example:
export ANTHROPIC_BASE_URL='https://openrouter.ai/api'
export ANTHROPIC_AUTH_TOKEN='sk-or-v1-...'
export ANTHROPIC_API_KEY=''
export ANTHROPIC_DEFAULT_HAIKU_MODEL='anthropic/claude-haiku-4-5-20251001'
export ANTHROPIC_DEFAULT_SONNET_MODEL='anthropic/claude-sonnet-4-6'
export ANTHROPIC_DEFAULT_OPUS_MODEL='anthropic/claude-opus-4-6'

Run a command with provider environment variables set.
claude-flow run -- <command> [args...]

The -- separator is important — everything after it is your command.
Examples:
# Run Claude Code once with OpenRouter
claude-flow run -- claude -p "Hello from OpenRouter!"
# Run with a different provider
claude-flow run deepseek -- claude -p "Use DeepSeek for this"
# Run any command (not just Claude)
claude-flow run -- node my-script.js
claude-flow run -- python my-app.py
# Chain with other commands
claude-flow run -- bash -c 'echo $ANTHROPIC_BASE_URL && claude -p "Test"'

Show current configuration and active provider.
claude-flow status

Output example:
claude-flow status
Config: /Users/you/.claude-flow/config.json
Active: openrouter
● OpenRouter Key: sk-or-v1-abc...def
haiku=anthropic/claude-haiku-4-5-20251001 sonnet=anthropic/claude-sonnet-4-6 opus=anthropic/claude-opus-4-6
○ DeepSeek Key: sk-...xyz
haiku=deepseek-chat sonnet=deepseek-chat opus=deepseek-reasoner
Switch to a different provider (must be configured first).
claude-flow switch <provider>

Examples:
claude-flow switch deepseek
# Now eval $(claude-flow env) will use DeepSeek

List all available providers with details.
claude-flow providers

Output:
Available Providers
● openrouter 200+ models from every major provider through one API
Auth: Bearer | Base: https://openrouter.ai/api | Key env: OPENROUTER_API_KEY
○ deepseek DeepSeek models via native Anthropic-compatible API
Auth: API key | Base: https://api.deepseek.com/anthropic | Key env: DEEPSEEK_API_KEY
○ openai OpenAI models (GPT-4.1, o3, etc.)
Auth: API key | Base: https://api.openai.com/v1 | Key env: OPENAI_API_KEY
○ gemini Gemini models via OpenAI-compatible endpoint
Auth: API key | Base: https://generativelanguage.googleapis.com/v1beta | Key env: GEMINI_API_KEY
○ custom Any Anthropic-compatible API endpoint
Auth: Bearer | Base: (custom) | Key env: CUSTOM_API_KEY
● = active provider
Browse available models for a provider.
claude-flow models [provider]

If provider is omitted, shows models for the active provider.
Example:
claude-flow models openrouter
claude-flow models deepseek

Show installed version.
claude-flow --version
# or
claude-flow -v

Without claude-flow:
Claude Code CLI → Anthropic API → Claude models only
(no other models possible)
With claude-flow:
┌─────────────┐ Anthropic ┌──────────────────────┐ Native ┌──────────────┐
│ Claude Code │ ── Messages ──→ │ Provider's │ ── API ───→ │ Any Model │
│ CLI │ API format │ Anthropic-compatible │ call │ │
│ │ ←─ Anthropic ── │ endpoint │ ←─ Native ─ │ GPT, GLM, │
│ │ format │ │ format │ Llama, Qwen, │
└─────────────┘ │ TRANSLATES responses │ │ DeepSeek, │
│ between formats │ │ Gemini, ... │
└──────────────────────┘ └──────────────┘
▲
claude-flow configures
this connection correctly
The key insight: Claude Code doesn't need to talk to models directly. It talks to an Anthropic-compatible proxy endpoint that handles the format translation. OpenRouter translates 200+ models into Anthropic format. DeepSeek's /anthropic endpoint does the same. claude-flow knows which endpoint to use for each provider and configures Claude Code to connect there.
When you use z-ai/glm-5 through OpenRouter:
- Claude Code sends an Anthropic Messages API request to https://openrouter.ai/api/v1/messages
- OpenRouter receives the request and translates it to GLM-5's native format
- GLM-5 processes and returns a response in its own format
- OpenRouter translates the response back into Anthropic Messages API format
- Claude Code receives a response it understands — streaming events, content blocks, tool_use, everything
Without this translation, Claude Code gets a response in the wrong format and crashes.
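A sketch of the outbound request from step 1, written as a plain object (field names follow the Anthropic Messages format; the credential is a placeholder and nothing is actually sent here):

```javascript
// Step 1, as a plain object — Claude Code sends the equivalent over HTTPS.
const outbound = {
  method: "POST",
  url: "https://openrouter.ai/api/v1/messages",
  headers: {
    Authorization: "Bearer sk-or-v1-example", // placeholder key
    "content-type": "application/json",
  },
  body: {
    model: "z-ai/glm-5",        // OpenRouter routes this ID to GLM-5
    max_tokens: 1024,
    messages: [{ role: "user", content: "Hello" }],
  },
};

console.log(outbound.url); // "https://openrouter.ai/api/v1/messages"
```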
Claude Code reads these env vars to determine which API to use:
| Env Var | Purpose | Example |
|---|---|---|
| ANTHROPIC_BASE_URL | API endpoint (where requests go) | https://openrouter.ai/api |
| ANTHROPIC_AUTH_TOKEN | Bearer token (proxy providers) | sk-or-v1-... |
| ANTHROPIC_API_KEY | API key (direct providers) or "" | sk-... or "" |
| ANTHROPIC_DEFAULT_HAIKU_MODEL | Fast tier model | anthropic/claude-haiku-4-5-20251001 |
| ANTHROPIC_DEFAULT_SONNET_MODEL | Balanced tier model | anthropic/claude-sonnet-4-6 |
| ANTHROPIC_DEFAULT_OPUS_MODEL | Powerful tier model | anthropic/claude-opus-4-6 |
ANTHROPIC_API_KEY must be set to an empty string ("") for proxy providers, not absent or unset.
This is the most common failure point when configuring Claude Code with alternative providers.
- If ANTHROPIC_API_KEY is absent/unset: Claude Code errors trying to read it
- If ANTHROPIC_API_KEY is non-empty: Claude Code uses it and ignores ANTHROPIC_AUTH_TOKEN
- If ANTHROPIC_API_KEY is the empty string (""): Claude Code falls back to ANTHROPIC_AUTH_TOKEN
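The three cases can be modeled as a tiny function — a sketch of the documented behavior, not Claude Code's actual source:

```javascript
// Models the documented fallback; NOT Claude Code's real implementation.
function resolveAuth(env) {
  if (env.ANTHROPIC_API_KEY === undefined) {
    // Case 1: absent/unset — Claude Code errors on startup
    throw new Error("ANTHROPIC_API_KEY is unset");
  }
  if (env.ANTHROPIC_API_KEY !== "") {
    // Case 2: non-empty — the key wins, AUTH_TOKEN is silently ignored
    return { method: "api_key", credential: env.ANTHROPIC_API_KEY };
  }
  // Case 3: empty string — fall back to the proxy token
  return { method: "auth_token", credential: env.ANTHROPIC_AUTH_TOKEN };
}

console.log(resolveAuth({ ANTHROPIC_API_KEY: "", ANTHROPIC_AUTH_TOKEN: "sk-or-v1-x" }));
// { method: 'auth_token', credential: 'sk-or-v1-x' }
```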
# ✓ CORRECT — Claude Code uses AUTH_TOKEN for OpenRouter
export ANTHROPIC_BASE_URL='https://openrouter.ai/api'
export ANTHROPIC_AUTH_TOKEN='sk-or-v1-...'
export ANTHROPIC_API_KEY='' # Empty string, not absent!
export ANTHROPIC_DEFAULT_SONNET_MODEL='anthropic/claude-sonnet-4-6'
# ✗ WRONG — Missing API_KEY, Claude Code errors
export ANTHROPIC_BASE_URL='https://openrouter.ai/api'
export ANTHROPIC_AUTH_TOKEN='sk-or-v1-...'
# (no ANTHROPIC_API_KEY set at all)
# ✗ WRONG — Non-empty API_KEY, Claude Code ignores AUTH_TOKEN
export ANTHROPIC_BASE_URL='https://openrouter.ai/api'
export ANTHROPIC_AUTH_TOKEN='sk-or-v1-...'
export ANTHROPIC_API_KEY='some-random-value'

# ✓ CORRECT — Claude Code uses API_KEY for DeepSeek
export ANTHROPIC_BASE_URL='https://api.deepseek.com/anthropic'
export ANTHROPIC_API_KEY='sk-...' # The actual key
export ANTHROPIC_DEFAULT_SONNET_MODEL='deepseek-chat'

The buildEnv() function in lib/env.js:
- Selects the correct endpoint that handles format translation for each provider
- For proxy providers (OpenRouter): sets ANTHROPIC_API_KEY = "" and ANTHROPIC_AUTH_TOKEN = apiKey
- For direct providers (DeepSeek, OpenAI): sets ANTHROPIC_API_KEY = apiKey (their endpoints accept Anthropic format natively)
- Maps models to tiers — sets ANTHROPIC_DEFAULT_HAIKU/SONNET/OPUS_MODEL
- Returns a complete env object ready to pass to Claude Code — no guesswork, no gotchas
Configuration is stored in ~/.claude-flow/config.json (600 permissions — owner read/write only).
{
"activeProvider": "openrouter",
"providers": {
"openrouter": {
"apiKey": "sk-or-v1-...",
"models": {
"haiku": "anthropic/claude-haiku-4-5-20251001",
"sonnet": "anthropic/claude-sonnet-4-6",
"opus": "anthropic/claude-opus-4-6"
}
},
"deepseek": {
"apiKey": "sk-...",
"models": {
"haiku": "deepseek-chat",
"sonnet": "deepseek-chat",
"opus": "deepseek-reasoner"
}
}
}
}

You can edit this file directly. Just remember:
- Keep it secure (contains API keys)
- Use valid JSON
- Restart your shell for changes to take effect
You can also set API keys via environment variables without using the config file:
export OPENROUTER_API_KEY='sk-or-v1-...'
export DEEPSEEK_API_KEY='sk-...'
# Then use claude-flow with those keys
claude-flow run -- claude -p "Hello"

Use claude-flow as a library in your Node.js projects (e.g., Claude Code Studio, build tools, CI/CD).
npm install github:Lexus2016/claude-flow

const {
buildEnv,
toShellExports,
mergeEnv,
getProvider,
listProviders,
PROVIDERS,
config,
} = require('claude-flow');

Build environment variables for Claude Code.
Arguments:
- provider (string): Provider name (openrouter, deepseek, openai, gemini, custom)
- opts (object):
  - apiKey (string, required): Provider API key
  - haiku (string): Model for haiku tier (overrides default)
  - sonnet (string): Model for sonnet tier (overrides default)
  - opus (string): Model for opus tier (overrides default)
  - model (string): Use the same model for all tiers (shorthand)
  - baseUrl (string): Override API base URL (custom providers only)
Returns: Object with env vars
Example:
const { buildEnv } = require('claude-flow');
// Use default models
const env = buildEnv('openrouter', {
apiKey: 'sk-or-v1-abc123...'
});
console.log(env);
// {
// ANTHROPIC_BASE_URL: 'https://openrouter.ai/api',
// ANTHROPIC_AUTH_TOKEN: 'sk-or-v1-abc123...',
// ANTHROPIC_API_KEY: '',
// ANTHROPIC_DEFAULT_HAIKU_MODEL: 'anthropic/claude-haiku-4-5-20251001',
// ANTHROPIC_DEFAULT_SONNET_MODEL: 'anthropic/claude-sonnet-4-6',
// ANTHROPIC_DEFAULT_OPUS_MODEL: 'anthropic/claude-opus-4-6'
// }

Custom models:
const env = buildEnv('openrouter', {
apiKey: 'sk-or-v1-...',
haiku: 'google/gemini-2.5-flash',
sonnet: 'deepseek/deepseek-chat',
opus: 'openai/gpt-4.1'
});

Single model for all tiers:
const env = buildEnv('openrouter', {
apiKey: 'sk-or-v1-...',
model: 'deepseek/deepseek-chat' // All tiers use this
});

Custom endpoint:
const env = buildEnv('custom', {
apiKey: 'sk-custom-key',
baseUrl: 'https://your-proxy.local/v1',
haiku: 'your-model-haiku',
sonnet: 'your-model-sonnet',
opus: 'your-model-opus'
});

Build provider env vars merged on top of process.env. Returns a new object (doesn't mutate process.env).
Useful for spawn() with custom env.
Example:
const { spawn } = require('child_process');
const { mergeEnv } = require('claude-flow');
const env = mergeEnv('openrouter', {
apiKey: 'sk-or-v1-...'
});
// Run Claude Code with OpenRouter
const child = spawn('claude', ['-p', 'Hello!'], {
env, // Use the merged env
stdio: 'inherit'
});

Format env vars as shell export statements.
Arguments:
- env (object): Env object from buildEnv()
Returns: String with shell exports (one per line)
Example:
const { buildEnv, toShellExports } = require('claude-flow');
const env = buildEnv('openrouter', { apiKey: 'sk-or-v1-...' });
const shellScript = toShellExports(env);
console.log(shellScript);
// export ANTHROPIC_BASE_URL='https://openrouter.ai/api'
// export ANTHROPIC_AUTH_TOKEN='sk-or-v1-...'
// export ANTHROPIC_API_KEY=''
// export ANTHROPIC_DEFAULT_HAIKU_MODEL='anthropic/claude-haiku-4-5-20251001'
// export ANTHROPIC_DEFAULT_SONNET_MODEL='anthropic/claude-sonnet-4-6'
// export ANTHROPIC_DEFAULT_OPUS_MODEL='anthropic/claude-opus-4-6'

Get provider configuration by name.
Arguments:
- name (string): Provider name or alias (openrouter, or, deepseek, ds, openai, gpt, gemini, custom)
Returns: Provider config object, or null if unknown
Example:
const { getProvider } = require('claude-flow');
const provider = getProvider('openrouter');
console.log(provider);
// {
// name: 'OpenRouter',
// description: '200+ models from every major provider...',
// envKey: 'OPENROUTER_API_KEY',
// baseUrl: 'https://openrouter.ai/api',
// auth: 'auth_token',
// models: { haiku: '...', sonnet: '...', opus: '...' },
// docsUrl: 'https://openrouter.ai/keys',
// popularModels: [...]
// }

List all provider names (excludes aliases).
Returns: Array of provider names
Example:
const { listProviders } = require('claude-flow');
console.log(listProviders());
// ['openrouter', 'deepseek', 'openai', 'gemini', 'custom']

Direct access to the provider configurations object.
Example:
const { PROVIDERS } = require('claude-flow');
console.log(PROVIDERS.openrouter.docsUrl);
// 'https://openrouter.ai/keys'
console.log(PROVIDERS.openrouter.popularModels);
// [{ id: 'anthropic/claude-sonnet-4-6', name: '...', tier: 'sonnet' }, ...]

Config file management (for integration with other tools).
Methods:
- config.load() — Load the entire config from disk
- config.save(cfg) — Save config to disk
- config.getActiveProvider() — Get the active provider name
- config.setActiveProvider(name) — Set the active provider
- config.getProviderConfig(name) — Get settings for a provider
- config.setProviderConfig(name, settings) — Save settings for a provider
- config.getConfigPath() — Get the config file path
Example:
const { config } = require('claude-flow');
const activeProvider = config.getActiveProvider();
const settings = config.getProviderConfig(activeProvider);
console.log(settings.apiKey); // The saved API key
console.log(settings.models); // { haiku: '...', sonnet: '...', opus: '...' }

Claude Code Studio uses claude-flow to let you choose LLM providers in the UI.
const { buildEnv, PROVIDERS } = require('claude-flow');
// In your app, let user pick a provider
const selectedProvider = 'openrouter';
const apiKey = user.apiKeys[selectedProvider];
// Build env vars
const env = buildEnv(selectedProvider, { apiKey });
// Pass to Claude Code subprocess
spawn('claude', args, { env: mergeEnv(selectedProvider, { apiKey }) });

Use claude-flow in your Docker container to route to any LLM:
FROM node:18-slim
RUN npm install -g github:Lexus2016/claude-flow
WORKDIR /app
COPY . .
# Run Claude Code with provider from env
CMD ["sh", "-c", "eval $(claude-flow env) && claude -p 'Your prompt'"]

Build and run:
docker build -t my-app .
docker run --env OPENROUTER_API_KEY='sk-or-v1-...' my-app

Or use docker-compose.yml:
version: '3'
services:
app:
build: .
environment:
OPENROUTER_API_KEY: ${OPENROUTER_API_KEY}
ANTHROPIC_DEFAULT_SONNET_MODEL: deepseek/deepseek-chat

Add to your ~/.bashrc, ~/.zshrc, or ~/.fish/config.fish:
# Activate Claude Code provider
eval "$(claude-flow env)"

Now Claude Code will use your configured provider for every invocation in new shells.
Reload your shell:
source ~/.bashrc  # or ~/.zshrc, etc.

Make sure claude-flow is installed and in your PATH:
which claude-flow
# If empty, install from GitHub:
npm install -g github:Lexus2016/claude-flow

You haven't run setup yet:
claude-flow setup openrouter
# or another provider

The provider is configured but the API key wasn't saved. Run setup again:
claude-flow setup openrouter

Or manually add the key to ~/.claude-flow/config.json.
Make sure you've:
- Run claude-flow setup <provider>
- Either:
  - Run eval $(claude-flow env) in your current shell
  - Or use claude-flow run -- claude ... to run Claude Code once
  - Or add eval "$(claude-flow env)" to your shell profile and restart
Check your active provider:
claude-flow status

This is a warning, not an error. It means your API key doesn't match the expected format for that provider. Double-check it's the right key:
- OpenRouter keys start with sk-or-
- DeepSeek keys start with sk-
- OpenAI keys start with sk-
- Gemini keys start with AI
If you're sure it's correct, you can ignore the warning.
For custom endpoints, make sure:
- The base URL includes /v1 or the full path to the API (e.g., https://your-proxy.local/v1)
- The model IDs are valid for your endpoint
- Your API key is correct
Run:
claude-flow status

and verify all settings.
If Claude Code errors with "invalid API key" or similar, it might be using the wrong auth method. Check what you configured:
claude-flow env

Look at the output:
- If you see ANTHROPIC_AUTH_TOKEN set (not empty) and ANTHROPIC_API_KEY='' (empty quotes) → you're using a proxy provider correctly ✓
- If you see ANTHROPIC_API_KEY set (your key) and no ANTHROPIC_AUTH_TOKEN → you're using a direct provider ✓
- If you see both with values → there's a conflict; reconfigure
Run the test suite:
npm test

Tests cover:
- Provider configuration loading
- Environment variable building for each provider
- The critical "empty string" behavior for proxy providers
- Shell export formatting
- Custom model overrides
- Error handling
All tests pass with zero external dependencies.
- Fork the repo
- Create a branch for your feature
- Write tests for new functionality
- Submit a pull request
- Add to PROVIDERS in lib/providers.js
- Include name, description, envKey, baseUrl, auth type, default models, and docsUrl
- Add test cases in test/test.js
- Update this README with the provider section
MIT — See LICENSE file
- Claude Code CLI — The CLI we're configuring
- Claude Code Studio — IDE that uses claude-flow
- OpenRouter — Access 200+ models
- DeepSeek Platform — Ultra-cheap reasoning models
- GitHub Issues: Report bugs or request features at https://github.com/Lexus2016/claude-flow/issues
- Discussions: https://github.com/Lexus2016/claude-flow/discussions
- Author: CDZV — Code Zero Digital Visual Trading
Made with love for developers who want to use any LLM with Claude Code.