An MCP (Model Context Protocol) server that provides unified access to multiple Large Language Model APIs for AI coding environments like Cursor and Claude Desktop.

- 8 LLM Providers: ChatGPT, Claude, DeepSeek, Gemini, Grok, Kimi, Perplexity, Mistral
- Smart Model Selection: tag-based preferences (coding, business, reasoning, math, creative, general)
- Prompt Logging: track all prompts
Add this skill:

npx mdskills install JamesANZ/cross-llm-mcp
Access multiple LLM APIs from one place. Call ChatGPT, Claude, DeepSeek, Gemini, Grok, Kimi, Perplexity, and Mistral with intelligent model selection, preferences, and prompt logging.
Ready to access multiple LLMs? Install in seconds:
Install in Cursor (Recommended):
Or install manually:
npm install -g cross-llm-mcp
# Or from source:
git clone https://github.com/JamesANZ/cross-llm-mcp.git
cd cross-llm-mcp && npm install && npm run build
Available tools:

- call-chatgpt: OpenAI's ChatGPT API
- call-claude: Anthropic's Claude API
- call-deepseek: DeepSeek API
- call-gemini: Google's Gemini API
- call-grok: xAI's Grok API
- call-kimi: Moonshot AI's Kimi API
- call-perplexity: Perplexity AI API
- call-mistral: Mistral AI API
- call-all-llms: Call all LLMs with the same prompt
- call-llm: Call a specific provider by name
- get-user-preferences: Get current preferences
- set-user-preferences: Set default model, cost preference, and tag-based preferences
- get-models-by-tag: Find models by tag (coding, business, reasoning, math, creative, general)
- get-prompt-history: View prompt history with filters
- get-prompt-stats: Get statistics about prompt logs
- delete-prompt-entries: Delete log entries by criteria
- clear-prompt-history: Clear all prompt logs

Click the install link above or use:
cursor://anysphere.cursor-deeplink/mcp/install?name=cross-llm-mcp&config=eyJjcm9zcy1sbG0tbWNwIjp7ImNvbW1hbmQiOiJucHgiLCJhcmdzIjpbIi15IiwiY3Jvc3MtbGxtLW1jcCJdfX0=
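The deeplink's config parameter is just base64-encoded JSON. A quick Node.js sketch decodes it to show exactly what Cursor registers:

```typescript
// Decode the base64 `config` payload embedded in the Cursor deeplink above.
const payload =
  "eyJjcm9zcy1sbG0tbWNwIjp7ImNvbW1hbmQiOiJucHgiLCJhcmdzIjpbIi15IiwiY3Jvc3MtbGxtLW1jcCJdfX0=";
const config = JSON.parse(Buffer.from(payload, "base64").toString("utf8"));

// The payload decodes to:
// {"cross-llm-mcp":{"command":"npx","args":["-y","cross-llm-mcp"]}}
// i.e. the deeplink simply registers `npx -y cross-llm-mcp` as the server command.
console.log(config);
```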
After installation, add your API keys in Cursor settings (see Configuration below).
Requirements: Node.js 18+ and npm
# Clone and build
git clone https://github.com/JamesANZ/cross-llm-mcp.git
cd cross-llm-mcp
npm install
npm run build
Add to claude_desktop_config.json:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
{
"mcpServers": {
"cross-llm-mcp": {
"command": "node",
"args": ["/absolute/path/to/cross-llm-mcp/build/index.js"],
"env": {
"OPENAI_API_KEY": "your_openai_api_key_here",
"ANTHROPIC_API_KEY": "your_anthropic_api_key_here",
"DEEPSEEK_API_KEY": "your_deepseek_api_key_here",
"GEMINI_API_KEY": "your_gemini_api_key_here",
"XAI_API_KEY": "your_grok_api_key_here",
"KIMI_API_KEY": "your_kimi_api_key_here",
"PERPLEXITY_API_KEY": "your_perplexity_api_key_here",
"MISTRAL_API_KEY": "your_mistral_api_key_here"
}
}
}
}
Restart Claude Desktop after configuration.
Set environment variables for the LLM providers you want to use:
export OPENAI_API_KEY="your_openai_api_key"
export ANTHROPIC_API_KEY="your_anthropic_api_key"
export DEEPSEEK_API_KEY="your_deepseek_api_key"
export GEMINI_API_KEY="your_gemini_api_key"
export XAI_API_KEY="your_grok_api_key"
export KIMI_API_KEY="your_kimi_api_key"
export PERPLEXITY_API_KEY="your_perplexity_api_key"
export MISTRAL_API_KEY="your_mistral_api_key"
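Only providers whose key is set are usable. The provider-to-variable mapping implied by the list above can be sketched as a standalone check (an illustration, not the server's internal code):

```typescript
// Each provider reads its API key from one environment variable (per the list above).
const PROVIDER_ENV_VARS: Record<string, string> = {
  chatgpt: "OPENAI_API_KEY",
  claude: "ANTHROPIC_API_KEY",
  deepseek: "DEEPSEEK_API_KEY",
  gemini: "GEMINI_API_KEY",
  grok: "XAI_API_KEY",
  kimi: "KIMI_API_KEY",
  perplexity: "PERPLEXITY_API_KEY",
  mistral: "MISTRAL_API_KEY",
};

// Return the providers that have a non-empty key in the given environment.
function configuredProviders(env: Record<string, string | undefined>): string[] {
  return Object.entries(PROVIDER_ENV_VARS)
    .filter(([, envVar]) => Boolean(env[envVar]))
    .map(([provider]) => provider);
}

// Check the real environment of the current process:
console.log(configuredProviders(process.env));
```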
Get a response from OpenAI:
{
"tool": "call-chatgpt",
"arguments": {
"prompt": "Explain quantum computing in simple terms",
"temperature": 0.7,
"max_tokens": 500
}
}
Get responses from all providers:
{
"tool": "call-all-llms",
"arguments": {
"prompt": "Write a short poem about AI",
"temperature": 0.8
}
}
Automatically use the best model for each task type:
{
"tool": "set-user-preferences",
"arguments": {
"defaultModel": "gpt-4o",
"costPreference": "cheaper",
"tagPreferences": {
"coding": "deepseek-r1",
"general": "gpt-4o",
"business": "claude-3.5-sonnet-20241022",
"reasoning": "deepseek-r1",
"math": "deepseek-r1",
"creative": "gpt-4o"
}
}
}
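The effect of tag preferences is that a request tagged with a task type resolves to the preferred model for that tag, falling back to defaultModel when no preference is set. A minimal sketch of that behavior (illustrative only, not the server's actual implementation):

```typescript
// Illustrative shape of the preferences used above (not the server's real schema).
interface Preferences {
  defaultModel: string;
  tagPreferences: Record<string, string>;
}

// Resolve which model a tagged request would use: the tag's preferred model,
// or defaultModel when the tag has no entry.
function resolveModel(prefs: Preferences, tag: string): string {
  return prefs.tagPreferences[tag] ?? prefs.defaultModel;
}

const prefs: Preferences = {
  defaultModel: "gpt-4o",
  tagPreferences: { coding: "deepseek-r1", math: "deepseek-r1" },
};

console.log(resolveModel(prefs, "coding")); // deepseek-r1
console.log(resolveModel(prefs, "creative")); // no entry, falls back to gpt-4o
```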
View your prompt logs:
{
"tool": "get-prompt-history",
"arguments": {
"provider": "chatgpt",
"limit": 10
}
}
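Assuming each log entry carries at least a provider and a prompt (the actual on-disk schema may differ), the provider and limit filters behave roughly like this sketch:

```typescript
// Assumed log entry shape, for illustration only.
interface PromptEntry {
  provider: string;
  prompt: string;
}

// Keep entries from the requested provider (or all, if none given),
// then return at most `limit` of the most recent ones.
function filterHistory(
  entries: PromptEntry[],
  provider?: string,
  limit = 10,
): PromptEntry[] {
  const matched = provider
    ? entries.filter((e) => e.provider === provider)
    : entries;
  return matched.slice(-limit);
}

const log: PromptEntry[] = [
  { provider: "chatgpt", prompt: "first" },
  { provider: "claude", prompt: "second" },
  { provider: "chatgpt", prompt: "third" },
];
console.log(filterHistory(log, "chatgpt", 10)); // the two chatgpt entries
```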
Models are tagged by their strengths:
- coding: deepseek-r1, deepseek-coder, gpt-4o, claude-3.5-sonnet-20241022
- business: claude-3-opus-20240229, gpt-4o, gemini-1.5-pro
- reasoning: deepseek-r1, o1-preview, claude-3.5-sonnet-20241022
- math: deepseek-r1, o1-preview, o1-mini
- creative: gpt-4o, claude-3-opus-20240229, gemini-1.5-pro
- general: gpt-4o-mini, claude-3-haiku-20240307, gemini-1.5-flash

Built with: Node.js, TypeScript, MCP SDK
Dependencies: @modelcontextprotocol/sdk, superagent, zod
Platforms: macOS, Windows, Linux
Preference Storage:
- macOS/Linux: ~/.cross-llm-mcp/preferences.json
- Windows: %APPDATA%/cross-llm-mcp/preferences.json

Prompt Log Storage:

- macOS/Linux: ~/.cross-llm-mcp/prompts.json
- Windows: %APPDATA%/cross-llm-mcp/prompts.json

⭐ If this project helps you, please star it on GitHub! ⭐
Contributions welcome! Please open an issue or submit a pull request.
MIT License; see LICENSE.md for details.
If you find this project useful, consider supporting it:
⚡ Lightning Network:
lnbc1pjhhsqepp5mjgwnvg0z53shm22hfe9us289lnaqkwv8rn2s0rtekg5vvj56xnqdqqcqzzsxqyz5vqsp5gu6vh9hyp94c7t3tkpqrp2r059t4vrw7ps78a4n0a2u52678c7yq9qyyssq7zcferywka50wcy75skjfrdrk930cuyx24rg55cwfuzxs49rc9c53mpz6zug5y2544pt8y9jflnq0ltlha26ed846jh0y7n4gm8jd3qqaautqa
₿ Bitcoin: bc1ptzvr93pn959xq4et6sqzpfnkk2args22ewv5u2th4ps7hshfaqrshe0xtp
Ξ Ethereum/EVM: 0x42ea529282DDE0AA87B42d9E83316eb23FE62c3f
Install via CLI
Install Cross-LLM MCP Server with a single command:

npx mdskills install JamesANZ/cross-llm-mcp

This downloads the skill files into your project, and your AI agent picks them up automatically.
Cross-LLM MCP Server works with Claude Code, Claude Desktop, Cursor, VS Code Copilot, Windsurf, Continue.dev, Gemini CLI, Amp, Roo Code, and Goose. Skills use the open SKILL.md format, which is compatible with any AI coding agent that reads markdown instructions.