A comprehensive Model Context Protocol (MCP) server that equips AI agents with advanced code generation, auditing, and debugging tools powered by OpenAI's Codex CLI. It exposes 10 specialized tools spanning the complete software development lifecycle: planning, implementation, review, debugging, security auditing, conversational assistance, and session and system management. Each tool is designed to maximize Codex CLI's capabilities with rich context and structured responses.
Perfect for agents that need:
- Strategic implementation planning with context awareness
- Precise code generation with architectural guidance
- Quality code review with actionable feedback
- Intelligent debugging with root cause analysis
- Comprehensive security auditing and compliance checking
- Interactive conversational assistance and brainstorming
- Real-time session and health monitoring
- Persistent Agent Containers: Each agent gets its own long-running Codex CLI environment
- Context Preservation: Full conversation history and workspace state maintained
- Container Management: Complete lifecycle management with monitoring tools
- Real-time Collaboration: Direct communication with always-running Codex instances
- Production Ready: Docker Compose orchestration with persistent storage
- Easy Integration: Standard MCP protocol for seamless agent connectivity
```mermaid
graph TB
    A[AI Agent 1] --> B[MCP Server Container]
    C[AI Agent 2] --> B
    D[AI Agent N] --> B
    B --> E[Persistent Agent Manager]
    E --> F[Agent 1 Codex Container<br/>Always Running]
    E --> G[Agent 2 Codex Container<br/>Always Running]
    E --> H[Agent N Codex Container<br/>Always Running]
    B --> I[Container Management Tools]
    B --> J[Persistence Manager]
    J --> K[Agent Data Storage]
```
- Docker Engine 20.10+ with Docker Compose v2.0+
- Python 3.12+ (for development and local testing)
- Authentication Method (choose one):
- ChatGPT Subscription (recommended) - uses your existing ChatGPT quota
- OpenAI API Key - separate billing per request
- Git for cloning the repository
- Minimum System Requirements:
- 4GB RAM (8GB recommended for multiple agents)
- 2 CPU cores (4 cores recommended)
- 10GB free disk space for containers and data
1. Clone the repository

   ```bash
   git clone https://github.com/your-org/codex-mcp-server.git
   cd codex-mcp-server
   ```

2. Configure authentication (choose one method)

   Method A: ChatGPT Subscription OAuth (Recommended)

   ```bash
   # No configuration needed - authenticate after starting the server
   # This uses your ChatGPT subscription quota
   ```

   Method B: OpenAI API Key

   ```bash
   # Create a .env file with your OpenAI API key
   echo "OPENAI_API_KEY=your-openai-api-key-here" > .env
   ```

3. Deploy the persistent agent architecture

   ```bash
   # Start the MCP server and persistent infrastructure
   docker-compose --profile codex-mcp up -d
   ```

4. Verify deployment

   ```bash
   # Check services are running
   docker-compose ps

   # Check server health
   curl http://localhost:8210/health
   ```
With the persistent agent architecture, agents connect to the containerized MCP server instead of running local processes.
```bash
claude mcp add --transport sse codex-mcp http://localhost:8210/sse --scope user
```
Add this to your Claude Desktop MCP settings:
```json
{
  "mcpServers": {
    "codex-mcp": {
      "type": "sse",
      "url": "http://localhost:8210/sse"
    }
  }
}
```

Test the connection with the MCP Inspector:

```bash
npx @modelcontextprotocol/inspector http://localhost:8210
```

```bash
# Test health endpoint
curl http://localhost:8210/health

# Test agent creation
curl -X POST http://localhost:8210/tools/codex_chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello Codex", "agent_id": "test_agent"}'
```

Recommended: Use your ChatGPT Plus subscription instead of separate OpenAI API billing!
ZERO-CONFIG OAUTH: The server automatically detects and uses existing OAuth tokens from your system - no manual configuration required!
Step 1: Authenticate with Codex CLI on your host system

```bash
# Install Codex CLI globally (if not already installed)
npm install -g @openai/codex

# Authenticate with your ChatGPT account
codex auth login
```

Step 2: Build and start the MCP server

```bash
# Build the MCP server
docker-compose --profile codex-mcp up -d --build
```

Then to start the MCP service:

```bash
# The server automatically detects and uses your OAuth tokens
docker-compose --profile codex-mcp up -d
```

Step 3: Verify OAuth is working

```bash
# Check that containers are using OAuth (not an API key)
docker logs codex-mcp-server | grep -i oauth

# You should see: "Injecting OAuth tokens for ChatGPT subscription authentication"
```

That's it! Automatic OAuth detection works by:
- Auto-detecting OAuth tokens in `~/.codex/auth.json`
- Auto-mounting tokens from the Windows host to Linux containers
- Auto-injecting tokens into each agent container
- Auto-switching to your ChatGPT Plus subscription
No configuration files needed! No environment variables to set! The system detects your existing Codex CLI authentication and "just works".
Check your subscription details:

```bash
# See what authentication method is active
docker exec [container-name] cat ~/.codex/auth.json | head -5

# Should show: "OPENAI_API_KEY": null and "tokens": {...}
```

Monitor agent containers:

```bash
# Watch agent startup to confirm OAuth injection
docker logs -f [agent-container-name]

# Look for: "OAuth tokens injected successfully"
```

If you don't have Codex CLI installed locally:
1. Use the integrated auth script

   ```bash
   # Clone the repository first
   git clone https://github.com/your-org/codex-mcp-server.git
   cd codex-mcp-server

   # Run OAuth authentication
   python auth.py login
   ```

2. Manual token setup

   ```bash
   # Create OAuth directory
   mkdir -p ~/.codex

   # Add your ChatGPT tokens (get from browser dev tools)
   echo '{"tokens": {"access_token": "your-token"}}' > ~/.codex/auth.json
   ```
The server uses intelligent auto-detection with zero configuration:
```bash
# 1. Server starts and checks for OAuth tokens automatically
docker-compose --profile codex-mcp up -d

# 2. Detection happens in this order:
#    ~/.codex/auth.json (Windows/Linux/Mac)
#    $USERPROFILE/.codex/auth.json (Windows)
#    $HOME/.codex/auth.json (Linux/Mac)
#    Environment variables (fallback)

# 3. If OAuth found: uses ChatGPT subscription
# 4. If no OAuth: falls back to API key
```

Smart path detection:

- Windows: `C:\Users\YourName\.codex\auth.json`
- Linux/Mac: `~/.codex/auth.json`
- Docker: automatically mounts from the host to `/app/.codex`
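The detection order above can be sketched as a small lookup. This is illustrative only: the function name, signature, and combined home-directory handling are assumptions, not the server's actual implementation.

```python
import os

def find_auth_source(env: dict, exists=os.path.exists):
    """Pick an auth method following the documented detection order (sketch)."""
    candidates = []
    if env.get("USERPROFILE"):  # Windows home directory
        candidates.append(os.path.join(env["USERPROFILE"], ".codex", "auth.json"))
    if env.get("HOME"):         # Linux/Mac home directory
        candidates.append(os.path.join(env["HOME"], ".codex", "auth.json"))
    for path in candidates:
        if exists(path):
            return ("oauth", path)   # ChatGPT subscription tokens found
    if env.get("OPENAI_API_KEY"):
        return ("api_key", None)     # fall back to API-key billing
    return (None, None)              # no credentials detected
```

Passing `env` and `exists` explicitly keeps the sketch testable without touching the real filesystem; the server itself reads the process environment directly.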
Override automatic detection if needed:
```bash
# Authentication preferences (usually not needed)
CODEX_AUTH_METHOD=auto        # auto (default), api_key, or oauth
CODEX_PREFER_OAUTH=true       # Prefer OAuth over API key (default)

# Custom OAuth directory (if tokens are elsewhere)
CODEX_AUTH_DIR=/custom/path/to/codex/dir

# OAuth client settings (for the manual auth script only)
OAUTH_CLIENT_ID=codex-cli     # OAuth client identifier
OAUTH_CALLBACK_PORT=8765      # Local callback server port
OAUTH_AUTO_OPEN_BROWSER=true  # Auto-open browser for auth
```

Why choose OAuth over API keys:
- Use existing ChatGPT Plus subscription (no separate billing)
- Higher rate limits compared to API keys
- More secure (tokens auto-refresh)
- Better quota management through ChatGPT interface
- Simplified billing (one subscription for everything)
OAuth not working? Check these:
1. Verify OAuth tokens exist:

   ```bash
   # Check that the host system has OAuth tokens
   ls -la ~/.codex/auth.json
   cat ~/.codex/auth.json | grep -i "access_token"
   ```

2. Confirm container mounting:

   ```bash
   # Check if tokens are mounted in the main container
   docker exec codex-mcp-server ls -la /app/.codex/
   docker exec codex-mcp-server cat /app/.codex/auth.json | head -5
   ```

3. Verify agent injection:

   ```bash
   # Check that agent containers have injected tokens
   docker exec [agent-container] cat ~/.codex/auth.json | head -5
   # Should show OAuth tokens, not an API key
   ```
Common issues and fixes:
- "No OAuth tokens found": Run `codex auth login` on the host system first
- "Read-only file system": Normal - tokens are injected, not mounted, in agents
- "Still using API key": Check logs for OAuth injection success messages
- "ChatGPT quota exceeded": Your subscription hit limits, wait for reset
Switch back to API key if needed:
```bash
# Set API key in .env file
echo "OPENAI_API_KEY=your-api-key" > .env
echo "CODEX_PREFER_OAUTH=false" >> .env

# Restart server
docker-compose --profile codex-mcp restart
```

Configure the server behavior through environment variables in your `.env` file:
| Variable | Description | Required | Default |
|---|---|---|---|
| **Authentication** | | | |
| `OPENAI_API_KEY` | OpenAI API key for Codex CLI | Conditional* | - |
| `CHATGPT_OAUTH_TOKEN` | ChatGPT OAuth token (if provided) | No | - |
| `CODEX_AUTH_METHOD` | Authentication method (auto/api_key/oauth) | No | auto |
| `CODEX_PREFER_OAUTH` | Prefer OAuth over API key | No | true |
| **OAuth Configuration** | | | |
| `OAUTH_CLIENT_ID` | OAuth client identifier | No | codex-cli |
| `OAUTH_CALLBACK_PORT` | OAuth callback server port | No | 8765 |
| `OAUTH_CALLBACK_TIMEOUT` | OAuth timeout in seconds | No | 300 |
| `OAUTH_AUTO_OPEN_BROWSER` | Auto-open browser for OAuth | No | true |
| `OAUTH_TOKEN_STORAGE_PATH` | Custom token storage path | No | ~/.codex/auth.json |
| **Codex Configuration** | | | |
| `CODEX_MODEL` | Codex model to use | No | gpt-5 |
| `CODEX_PROVIDER` | AI provider | No | openai |
| `CODEX_APPROVAL_MODE` | Approval mode (suggest/auto/manual) | No | suggest |
| **Server Configuration** | | | |
| `MAX_CONCURRENT_SESSIONS` | Maximum agent sessions | No | 20 |
| `SESSION_TIMEOUT` | Session timeout in seconds | No | 3600 |
| `CONTAINER_CPU_LIMIT` | CPU limit per agent container | No | 4.0 |
| `CONTAINER_MEMORY_LIMIT` | Memory limit per agent container | No | 2048m |
| `LOG_LEVEL` | Logging level | No | INFO |
| `PERSISTENT_MODE` | Enable persistent agent containers | No | true |
*Required if not using OAuth authentication
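Putting the variables together, a minimal `.env` for an API-key deployment might look like this (all values are placeholders; include only the settings you need to override):

```bash
OPENAI_API_KEY=your-openai-api-key-here
CODEX_PREFER_OAUTH=false
MAX_CONCURRENT_SESSIONS=20
SESSION_TIMEOUT=3600
LOG_LEVEL=INFO
```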
The system creates persistent volumes for:
- Agent Data: `/data/agents/` - individual agent workspaces
- Server Config: `/config/` - server configuration files
- Session Data: `/data/sessions/` - session metadata
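In `docker-compose.yml` terms, the mapping looks roughly like this. The volume and service names here are illustrative assumptions; consult the repository's actual compose file for the real definitions.

```yaml
services:
  codex-mcp-server:
    volumes:
      - agent_data:/data/agents      # individual agent workspaces
      - session_data:/data/sessions  # session metadata
      - ./config:/config:ro          # server configuration (read-only)

volumes:
  agent_data:
  session_data:
```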
The server provides 10 comprehensive MCP tools for AI agents:
- `plan(task, repo_context?, constraints?)` → `PlanResponse`
  Create structured implementation plans with file targets and architectural decisions
- `implement(task, target_files, context_files?, requirements?)` → `ImplementResponse`
  Generate precise code changes with diffs, explanations, and integration notes
- `review(content, rubric?, focus_areas?)` → `ReviewResponse`
  Analyze code quality with inline comments, scores, and actionable recommendations
- `fix(failing_tests?, error_output?, context_files?, symptoms?)` → `FixResponse`
  Debug issues with targeted fixes, root cause analysis, and prevention strategies
- `chat(message, context?, previous_messages?, reference_files?)` → `ChatResponse`
  Interactive conversational AI assistance for brainstorming, second opinions, and guidance
- `audit(code, file_paths?, focus_areas?, severity_threshold?, compliance_standards?)` → `AuditResponse`
  Comprehensive security and quality auditing with CWE/CVE references and compliance checking
- `debug(error_message, code_context?, stack_trace?, environment_info?, reproduction_steps?)` → `DebugResponse`
  Intelligent debugging assistance with root cause analysis and multiple solution approaches
- `list_sessions(agent_id?)` → `Dict`
  List active Codex CLI sessions with status and metadata
- `get_my_session_info()` → `Dict`
  Get current session details including container ID and activity status
- `health_check()` → `HealthCheckResponse`
  Server health status, uptime, active sessions, and system information
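The HTTP tool endpoints can be called from Python as well as curl. The helper below is a hypothetical sketch: the `/tools/<name>` endpoint layout is inferred from the curl examples in this README and should be verified against your deployment.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8210"  # MCP server address (assumption: default port)

def build_tool_request(tool: str, **params) -> urllib.request.Request:
    """Build a POST request for a server tool endpoint (illustrative helper)."""
    body = json.dumps(params).encode("utf-8")
    return urllib.request.Request(
        url=f"{BASE_URL}/tools/{tool}",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Mirrors the curl test: POST /tools/codex_chat with a JSON body
req = build_tool_request("codex_chat", message="Hello Codex", agent_id="test_agent")
# urllib.request.urlopen(req) would send it once the server is running
```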
- Container-per-agent: Complete isolation between agent containers
- Non-root execution: All containers run as unprivileged users
- Resource limits: CPU and memory quotas prevent resource exhaustion
- Network isolation: Containers communicate only through MCP server
- Runtime injection: No secrets baked into container images
- Read-only mounts: Configuration files mounted as read-only
- Workspace isolation: Separate working directories per agent
- Persistent storage: Secure volume mounting with proper permissions
- Agent boundaries: Agents cannot access other agent containers
- File system isolation: Separate persistent workspaces per agent
- Authentication: OpenAI API key required for all operations
```bash
# Test persistent architecture
python test_persistent_simple.py

# Check container status
docker-compose ps

# View logs
docker-compose logs codex-mcp-server
```

```bash
# Install development dependencies (if modifying source)
pip install -r requirements-dev.txt

# Rebuild containers after changes
docker-compose --profile codex-mcp build --no-cache

# Restart services
docker-compose --profile codex-mcp restart
```

- Server health: `http://localhost:8210/health` - Server status
- Agent status: Use the `get_agent_status` tool for individual agents
- Container metrics: Real-time CPU and memory tracking per agent

- Container logs: `docker-compose logs codex-mcp-server`
- Agent logs: Individual agent container logs available
- Structured logging: JSON-formatted logs with correlation IDs

- Active agents: `list_active_agents` - Show all running agents
- Resource usage: Agent-specific CPU and memory metrics
- Cleanup: `cleanup_inactive_agents` - Remove stale agents
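The cleanup behavior can be illustrated with a short sketch. The function name and session shape are assumptions for illustration; only the 3600-second default mirrors the documented `SESSION_TIMEOUT`.

```python
SESSION_TIMEOUT = 3600  # seconds, matching the SESSION_TIMEOUT default above

def find_inactive_agents(sessions: dict, now: float,
                         timeout: float = SESSION_TIMEOUT) -> list:
    """Return agent IDs whose last activity is older than the timeout.

    `sessions` maps agent_id -> last_activity (Unix timestamp).
    """
    return [agent_id for agent_id, last_seen in sessions.items()
            if now - last_seen > timeout]

sessions = {"agent_a": 1000.0, "agent_b": 4500.0}
stale = find_inactive_agents(sessions, now=5000.0)
# agent_a has been idle 4000 s (> 3600); agent_b only 500 s
```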
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Follow the development workflow in the attached guide
- Ensure tests pass and coverage is maintained
- Submit a pull request
- Keep files under 500 lines (split into modules as needed)
- Write tests for all new features
- Follow PEP8 style guidelines
- Use type hints throughout
- Document all functions with Google-style docstrings
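For example, a contribution following these standards (type hints throughout plus a Google-style docstring) might look like this; `summarize_scores` is a hypothetical example function, not part of the codebase:

```python
def summarize_scores(scores: list[float]) -> dict[str, float]:
    """Summarize review scores for reporting.

    Args:
        scores: Individual review scores on a 0-10 scale.

    Returns:
        A dict with the minimum, maximum, and mean score.

    Raises:
        ValueError: If `scores` is empty.
    """
    if not scores:
        raise ValueError("scores must not be empty")
    return {
        "min": min(scores),
        "max": max(scores),
        "mean": sum(scores) / len(scores),
    }
```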
Container startup fails

```bash
# Check Docker daemon
docker info

# Verify services
docker-compose ps

# Check logs
docker-compose logs codex-mcp-server
```

Authentication errors

```bash
# Verify API key in .env file
cat .env

# Test server health
curl http://localhost:8210/health
```

Agent container issues

```bash
# List active agents
curl http://localhost:8210/tools/list_active_agents

# Check specific agent
curl http://localhost:8210/tools/get_agent_status \
  -d '{"agent_id": "your_agent_id"}'

# Manual cleanup
curl http://localhost:8210/tools/cleanup_inactive_agents
```

Port conflicts

```bash
# Check what's using port 8210
netstat -ano | findstr :8210

# Change the port in docker-compose.yml if needed
# Then restart: docker-compose --profile codex-mcp up -d
```

This project is licensed under the MIT License - see the LICENSE file for details.
- OpenAI Codex CLI - The underlying CLI tool
- FastMCP - MCP server framework
- Model Context Protocol - The protocol specification
- Codex CLI - Official OpenAI Codex CLI
- MCP Servers - Collection of MCP servers
- FastMCP Examples - Additional MCP server examples
Built for the AI agent ecosystem