MCP server connecting Claude/Cursor to Codex CLI. Enables code analysis via `@` file references, multi-turn conversations, sandboxed edits, and structured change mode.

- **File Analysis** — Reference files with `@src/`, `@package.json` syntax
- **Multi-Turn Sessions** — Conversation continuity with workspace isolation
- **Native Resume** — Uses `codex resume` for context preservation (CLI v0.36.0+)
- **Local OSS Models** — Run with local Ollama or LM Studio instead of OpenAI
Well-documented MCP server bridging Claude to Codex CLI with multi-turn sessions and local model support.

Add this skill:

```shell
npx mdskills install x51xxx/codex-mcp-tool
```
Or register the server with Claude Code directly:

```shell
claude mcp add codex-cli -- npx -y @trishchuk/codex-mcp-tool
```
Prerequisites: Node.js 18+, Codex CLI installed and authenticated.
```json
{
  "mcpServers": {
    "codex-cli": {
      "command": "npx",
      "args": ["-y", "@trishchuk/codex-mcp-tool"]
    }
  }
}
```
Config locations:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
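Cursor can register the same server with an identical entry; a minimal sketch, assuming the standard `~/.cursor/mcp.json` location (a project-level `.cursor/mcp.json` also works):

```json
{
  "mcpServers": {
    "codex-cli": {
      "command": "npx",
      "args": ["-y", "@trishchuk/codex-mcp-tool"]
    }
  }
}
```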
```javascript
// File analysis
'explain the architecture of @src/';
'analyze @package.json and list dependencies';

// With specific model
'use codex with model gpt-5.3-codex to analyze @algorithm.py';

// Multi-turn conversations (v1.4.0+)
'ask codex sessionId:"my-project" prompt:"explain @src/"';
'ask codex sessionId:"my-project" prompt:"now add error handling"';

// Brainstorming
'brainstorm ways to optimize CI/CD using SCAMPER method';

// Sandbox mode
'use codex sandbox:true to create and run a Python script';

// Web search
'ask codex search:true prompt:"latest TypeScript 5.7 features"';

// Local OSS model (Ollama)
'ask codex localProvider:"ollama" model:"qwen3:8b" prompt:"explain @src/"';
```
| Tool | Description |
|---|---|
| `ask-codex` | Execute Codex CLI with file analysis, models, sessions |
| `brainstorm` | Generate ideas with SCAMPER, design thinking, etc. |
| `list-sessions` | View/delete/clear conversation sessions |
| `health` | Diagnose CLI installation, version, features |
| `ping` / `help` | Test connection, show CLI help |
Default: `gpt-5.3-codex`, with fallback chain `gpt-5.2-codex` → `gpt-5.1-codex-max` → `gpt-5.2`.
| Model | Use Case |
|---|---|
| `gpt-5.3-codex` | Latest frontier agentic coding (default) |
| `gpt-5.2-codex` | Frontier agentic coding |
| `gpt-5.1-codex-max` | Deep and fast reasoning |
| `gpt-5.1-codex-mini` | Cost-efficient quick tasks |
| `gpt-5.2` | Broad knowledge, reasoning, and coding |
Multi-turn conversations with workspace isolation:
```json
{ "prompt": "analyze code", "sessionId": "my-session" }
{ "prompt": "continue from here", "sessionId": "my-session" }
{ "prompt": "start fresh", "sessionId": "my-session", "resetSession": true }
```
Environment:

- `CODEX_SESSION_TTL_MS` - Session TTL (default: 24h)
- `CODEX_MAX_SESSIONS` - Max sessions (default: 50)

Run with local Ollama or LM Studio instead of OpenAI:
```javascript
// Ollama
{ "prompt": "analyze @src/", "localProvider": "ollama", "model": "qwen3:8b" }

// LM Studio
{ "prompt": "analyze @src/", "localProvider": "lmstudio", "model": "my-model" }

// Auto-select provider
{ "prompt": "analyze @src/", "oss": true }
```
Requirements: Ollama running locally with a model that supports tool calling (e.g. qwen3:8b).
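Before pointing the tool at a local provider, the model has to be available locally; a minimal sketch using standard Ollama CLI commands (the model name matches the example above):

```shell
# Download a tool-calling-capable model (one-time)
ollama pull qwen3:8b

# Confirm the model is available to the local daemon
ollama list
```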
| Parameter | Description |
|---|---|
| `model` | Model selection |
| `sessionId` | Enable conversation continuity |
| `sandbox` | Enable `--full-auto` mode |
| `search` | Enable web search |
| `changeMode` | Structured OLD/NEW edits |
| `addDirs` | Additional writable directories |
| `toolOutputTokenLimit` | Cap response verbosity (100-10,000) |
| `reasoningEffort` | Reasoning depth: `low`, `medium`, `high`, `xhigh` |
| `oss` | Use local OSS model provider |
| `localProvider` | Local provider: `lmstudio` or `ollama` |
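The parameters above compose in a single call; an illustrative combination (the prompt and values are examples, not defaults):

```json
{
  "prompt": "refactor @src/ for readability",
  "model": "gpt-5.1-codex-mini",
  "sandbox": true,
  "changeMode": true,
  "toolOutputTokenLimit": 4000,
  "reasoningEffort": "medium"
}
```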
| Version | Features |
|---|---|
| v0.60.0+ | GPT-5.2 model family |
| v0.59.0+ | --add-dir, token limits |
| v0.52.0+ | Native --search flag |
| v0.36.0+ | Native codex resume (sessions) |
```shell
codex --version  # Check CLI version
codex login      # Authenticate
```

Use the `health` tool for diagnostics: `'use health verbose:true'`
- v1.5.x → v1.6.0: Local OSS model support (`localProvider`, `oss`), `gpt-5.3-codex` default model, `xhigh` reasoning effort.
- v1.3.x → v1.4.0: New `sessionId` parameter, `list-sessions`/`health` tools, structured error handling. No breaking changes.
MIT License. Not affiliated with OpenAI.
Documentation | Issues | Inspired by jamubc/gemini-mcp-tool
Install Codex MCP Tool with a single command:

```shell
npx mdskills install x51xxx/codex-mcp-tool
```

This downloads the skill files into your project, and your AI agent picks them up automatically.
Codex MCP Tool works with Claude Code, Claude Desktop, Cursor, VS Code Copilot, Windsurf, Continue.dev, Codex, Gemini CLI, Amp, Roo Code, Goose, OpenCode, Trae, Qodo, and Command Code. Skills use the open SKILL.md format, which is compatible with any AI coding agent that reads markdown instructions.