An MCP (Model Context Protocol) server that exposes a collection of system prompts, summaries, and tool definitions from popular AI tools as MCP tools for AI coding environments like Cursor and Claude Desktop.
- Automatic Discovery – Every prompt in prompts/ is automatically exposed as an MCP tool
- Model-Aware Suggestions – Get prompt recommendations based on your LLM (Claude, GPT, Gemini, and more)
Add this skill:
npx mdskills install JamesANZ/system-prompts-mcp-server
Well-documented MCP server for accessing AI tool prompts with automatic discovery and model-aware suggestions.
Access system prompts from AI tools in your workflow. Browse and fetch prompts from Devin, Cursor, Claude, GPT, and more. Model-aware suggestions help you find the perfect prompt for your LLM.
Ready to explore system prompts? Install in seconds:
Install in Cursor (Recommended):
Or install manually:
npm install -g system-prompts-mcp-server
# Or from source:
git clone https://github.com/JamesANZ/system-prompts-and-models-of-ai-tools.git
cd system-prompts-and-models-of-ai-tools && npm install && npm run build
Available tools:
- list_prompts – Browse available prompts with filters (service, flavor, provider)
- get_prompt_suggestion – Get ranked prompt suggestions for your LLM and keywords
- One tool per discovered prompt – Direct access to any prompt (e.g., cursor-agent-system, devin-summary)
The server scans the prompts/ directory for .txt, .md, .yaml, .yml, and .json files.
Click the install link above or use:
cursor://anysphere.cursor-deeplink/mcp/install?name=system-prompts-mcp&config=eyJzeXN0ZW0tcHJvbXB0cy1tY3AiOnsiY29tbWFuZCI6Im5weCIsImFyZ3MiOlsiLXkiLCJzeXN0ZW0tcHJvbXB0cy1tY3Atc2VydmVyIl19fQ==
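The config parameter in that deeplink is just base64-encoded JSON. If you're curious what the one-click install actually registers, you can decode it yourself (a sketch assuming a shell with GNU base64; older macOS uses -D instead of -d):

```shell
# Decode the config payload embedded in the Cursor install link.
# It resolves to a standard MCP server entry that runs the published
# npm package via npx.
echo 'eyJzeXN0ZW0tcHJvbXB0cy1tY3AiOnsiY29tbWFuZCI6Im5weCIsImFyZ3MiOlsiLXkiLCJzeXN0ZW0tcHJvbXB0cy1tY3Atc2VydmVyIl19fQ==' | base64 -d
# → {"system-prompts-mcp":{"command":"npx","args":["-y","system-prompts-mcp-server"]}}
```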
Requirements: Node.js 18+ and npm
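Before building from source, it's worth confirming the toolchain is in place. A minimal check (the sed-based version parsing is just one way to do it):

```shell
# Verify that Node.js 18+ is available; the build and runtime both
# require it (npm ships with Node).
if command -v node >/dev/null 2>&1; then
  major=$(node --version | sed 's/^v\([0-9]*\)\..*/\1/')
  if [ "$major" -ge 18 ]; then
    status="Node.js OK (v$major)"
  else
    status="Need Node.js 18+, found v$major"
  fi
else
  status="Node.js not found; install Node.js 18+ and npm"
fi
echo "$status"
```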
# Clone and build
git clone https://github.com/JamesANZ/system-prompts-and-models-of-ai-tools.git
cd system-prompts-and-models-of-ai-tools
npm install
npm run build
# Run server
npm start
Add to claude_desktop_config.json:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
{
"mcpServers": {
"system-prompts-mcp": {
"command": "node",
"args": ["/absolute/path/to/system-prompts-and-models-of-ai-tools/dist/index.js"],
"env": {
"PROMPT_LIBRARY_ROOT": "/absolute/path/to/system-prompts-and-models-of-ai-tools/prompts"
}
}
}
}
Restart Claude Desktop after configuration.
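A common failure mode is a JSON syntax error (often a trailing comma) that makes the server silently fail to appear. A quick sanity check, sketched here against a copy of the sample config (run the same check against your real config path):

```shell
# Write the sample config to a temp file and validate it; json.tool
# exits non-zero on invalid JSON.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
{
  "mcpServers": {
    "system-prompts-mcp": {
      "command": "node",
      "args": ["/absolute/path/to/system-prompts-and-models-of-ai-tools/dist/index.js"],
      "env": {
        "PROMPT_LIBRARY_ROOT": "/absolute/path/to/system-prompts-and-models-of-ai-tools/prompts"
      }
    }
  }
}
EOF
python3 -m json.tool "$cfg" > /dev/null && echo "config OK"
```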
Browse prompts with optional filters:
{
"tool": "list_prompts",
"arguments": {
"service": "cursor",
"flavor": "system",
"limit": 10
}
}
Find the best prompt for your LLM and use case:
{
"tool": "get_prompt_suggestion",
"arguments": {
"userLlm": "claude-3.5-sonnet",
"keywords": ["code", "pair programming"]
}
}
Call a prompt directly by its tool name:
{
"tool": "cursor-agent-system",
"arguments": {}
}
Get structured metadata only:
{
"tool": "cursor-agent-system",
"arguments": {
"format": "json"
}
}
Add prompts by placing files in the prompts/ directory:
Supported formats: .txt, .md, .yaml, .yml, .json
Directory structure:
prompts/My Service/
├── System Prompt.txt → Tool: "my-service-system-prompt-system"
└── tools.json → Tool: "my-service-tools-tools"
After adding prompts, restart the MCP server. Use list_prompts to find your custom prompts.
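The layout above can be created in a couple of commands (a sketch; the service name and prompt text are placeholders):

```shell
# Add a new service folder with one prompt file; after restarting the
# server, discovery should expose it as an MCP tool named from the
# folder and file names, as shown in the mapping above.
mkdir -p "prompts/My Service"
cat > "prompts/My Service/System Prompt.txt" <<'EOF'
You are a helpful assistant for My Service.
EOF
ls "prompts/My Service"
# → System Prompt.txt
```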
Custom directory: Set PROMPT_LIBRARY_ROOT environment variable to use a different location.
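For example, to point the server at a prompt library outside the repo (the directory path here is only an example):

```shell
# PROMPT_LIBRARY_ROOT is read at startup, so export it in the same
# environment that launches the server process.
export PROMPT_LIBRARY_ROOT="$HOME/my-prompts"
mkdir -p "$PROMPT_LIBRARY_ROOT"
echo "prompt root: $PROMPT_LIBRARY_ROOT"
# then launch as usual: node dist/index.js  (or npm start)
```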
Built with: Node.js, TypeScript, MCP SDK
Dependencies: @modelcontextprotocol/sdk, zod
Platforms: macOS, Windows, Linux
Environment Variables:
PROMPT_LIBRARY_ROOT (optional): Override the prompt root directory (defaults to prompts/)
Project Structure:
src/ – TypeScript MCP server implementation
dist/ – Compiled JavaScript
prompts/ – Prompt library and original documentation
⭐ If this project helps you, please star it on GitHub! ⭐
Contributions welcome! Feel free to adapt the discovery logic, add tests, or extend metadata inference for new prompt formats.
See the original repository for license information.
If you find this project useful, consider supporting it:
⚡ Lightning Network
lnbc1pjhhsqepp5mjgwnvg0z53shm22hfe9us289lnaqkwv8rn2s0rtekg5vvj56xnqdqqcqzzsxqyz5vqsp5gu6vh9hyp94c7t3tkpqrp2r059t4vrw7ps78a4n0a2u52678c7yq9qyyssq7zcferywka50wcy75skjfrdrk930cuyx24rg55cwfuzxs49rc9c53mpz6zug5y2544pt8y9jflnq0ltlha26ed846jh0y7n4gm8jd3qqaautqa
₿ Bitcoin: bc1ptzvr93pn959xq4et6sqzpfnkk2args22ewv5u2th4ps7hshfaqrshe0xtp
Ξ Ethereum/EVM: 0x42ea529282DDE0AA87B42d9E83316eb23FE62c3f
Install via CLI
npx mdskills install JamesANZ/system-prompts-mcp-server
System Prompts MCP Server is a free, open-source AI agent skill.
Install System Prompts MCP Server with a single command:
npx mdskills install JamesANZ/system-prompts-mcp-server
This downloads the skill files into your project, and your AI agent picks them up automatically.
System Prompts MCP Server works with Claude Code, Claude Desktop, Cursor, VS Code Copilot, Windsurf, Continue, Gemini CLI, Amp, Roo Code, and Goose. Skills use the open SKILL.md format, which is compatible with any AI coding agent that reads markdown instructions.