Add this skill
npx mdskills install particlefuture/1mcpserver
Well-documented meta-MCP server that discovers and configures other MCP servers, with flexible deployment options.
MCP of MCPs: automatically discover and configure MCP servers on your machine (remote or local).
After setup, you can usually just say:
"I want to perform [task]. Call the deep_search tool and follow the outlined steps."
The goal is that you only install this MCP server, and it handles the rest (searching servers, selecting servers, configuring servers, etc.).
Choose one of the following:
1. Remote (simplest & fastest)
2. Local (prebuilt): Docker, uvx, or npx
3. Local (from source): run this repo directly
Use the hosted endpoint (recommended for the simplest setup).
Docs + guided setup: https://mcp.1mcpserver.com/
Add the following entry to your client config file:
Cursor: ./.cursor/mcp.json
Gemini CLI: ./gemini/settings.json (see Gemini docs)
Claude Desktop (macOS): ~/Library/Application Support/Claude/claude_desktop_config.json
Claude Desktop (Windows): %APPDATA%\Claude\claude_desktop_config.json
Codex (macOS/Linux): ~/.codex/config.toml
Codex (Windows): %USERPROFILE%\.codex\config.toml
Remote config (JSON):
{
"mcpServers": {
"1mcpserver": {
"url": "https://mcp.1mcpserver.com/mcp/",
"headers": {
"Accept": "text/event-stream",
"Cache-Control": "no-cache"
}
}
}
}
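The merge can also be scripted. A minimal Python sketch, assuming a JSON-based client config; the /tmp path below is a stand-in for this example, so point config_path at your client's real config file:

```python
import json
from pathlib import Path

# Stand-in path for this sketch; substitute your client's actual config file.
config_path = Path("/tmp/claude_desktop_config.json")

# Load the existing config (or start fresh), add the 1mcpserver entry
# under mcpServers without disturbing other servers, and write it back.
config = json.loads(config_path.read_text()) if config_path.exists() else {}
config.setdefault("mcpServers", {})["1mcpserver"] = {
    "url": "https://mcp.1mcpserver.com/mcp/",
    "headers": {"Accept": "text/event-stream", "Cache-Control": "no-cache"},
}
config_path.write_text(json.dumps(config, indent=2))
```

Existing entries under mcpServers are preserved because only the "1mcpserver" key is assigned.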
If you already have other servers configured, just merge this entry under mcpServers. For example:
{
"mcpServers": {
"1mcpserver": {
"url": "https://mcp.1mcpserver.com/mcp/",
"headers": {
"Accept": "text/event-stream",
"Cache-Control": "no-cache"
}
},
"file-system": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
}
}
}
Tip: If your client supports it, move the config file into your home directory to apply globally.
Use this when you want everything local, or when your MCP client only supports STDIO.
docker run -p 8080:8080 ghcr.io/particlefuture/1mcpserver:latest
To map a different host port:
docker run -p <host-port>:8080 ghcr.io/particlefuture/1mcpserver:latest
To run with stdio instead of streamable-http (you might see some delay when connecting):
docker run --rm -i ghcr.io/particlefuture/1mcpserver:latest --local
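With stdio, the client spawns the container itself. A config sketch under that assumption (the args mirror the docker command above; the exact shape depends on your client):

```json
{
  "mcpServers": {
    "1mcpserver": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "ghcr.io/particlefuture/1mcpserver:latest", "--local"]
    }
  }
}
```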
{
"mcpServers": {
"1mcpserver": {
"url": "https://mcp.1mcpserver.com/mcp/"
}
}
}
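If you connect to the locally mapped streamable-http port instead, the url would point at localhost. This sketch assumes the container serves the same /mcp/ path as the hosted endpoint:

```json
{
  "mcpServers": {
    "1mcpserver": {
      "url": "http://localhost:8080/mcp/"
    }
  }
}
```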
npx -y @1mcpserver/1mcpserver
Clone this repo and run directly.
git clone https://github.com/particlefuture/MCPDiscovery.git
cd MCPDiscovery
uv sync
uv run server.py --local
{
"mcpServers": {
"1mcpserver": {
"command": "/path/to/uv",
"args": [
"--directory",
"",
"run",
"server.py",
"--local"
]
}
}
}
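The command field above needs an absolute path to uv. One way to find it, equivalent to running which uv in a shell:

```python
import shutil

# Prints the absolute path to uv, or None if uv is not on PATH.
uv_path = shutil.which("uv")
print(uv_path)
```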
If your client supports remote url servers, you can use the Remote setup instead.
If you want your LLM to have file-system access, add an MCP filesystem server and point it at the directory you want to allow:
{
"mcpServers": {
"file-system": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "~/"]
}
}
}
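Granting access to your entire home directory is broad. The filesystem server accepts one or more allowed-directory arguments, so you can scope it to specific folders instead (the paths below are placeholders):

```json
{
  "mcpServers": {
    "file-system": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/project-a",
        "/path/to/project-b"
      ]
    }
  }
}
```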
There are two search modes:
For explicit requests like: "I want an MCP server that handles payments."
Returns a shortlist of relevant MCP servers.
For higher-level or complex goals like: "Build a website that analyzes other websites."
The LLM breaks the goal into components/steps, finds MCP servers for each part, and if something is missing, it asks how you would like to proceed.
Deep Search stages:
Generates and tests template server code (test_server_template_code).
Skips tools with the internal_ prefix unless instructed otherwise.
Data sources:
Published to:
ModuleNotFoundError even after installing: delete the venv and recreate it.
Please create an issue or contact me directly at zjia71@gatech.edu if you encounter ANY issue or frustration. I really hope the setup is as smooth as possible!
Install via CLI
1mcpserver is a free, open-source AI agent skill. Install it with a single command:
npx mdskills install particlefuture/1mcpserver
This downloads the skill files into your project, and your AI agent picks them up automatically.
1mcpserver works with Claude Code, Claude Desktop, Cursor, VS Code Copilot, Windsurf, Continue, Codex, Gemini CLI, Amp, Roo Code, Goose, OpenCode, Trae, Qodo, and Command Code. Skills use the open SKILL.md format, which is compatible with any AI coding agent that reads markdown instructions.