MCP (Model Context Protocol) servers let AI agents reach beyond the codebase. They connect your agent to external APIs, databases, search engines, and services through a standardized protocol — so the agent can actually take action, not just write code.
957 servers
MCP server for Google Keep. 1. Add the MCP server to your MCP servers config: 2. Add your credentials: GOOGLE_EMAIL: Your Google account email address GOOGLE_MASTER_TOKEN: Your Google account master token Check https://gkeepapi.readthedocs.io/en/latest/#obtaining-a-master-token and https://github.com/simon-weber/gpsoauth?tab=readme-ov-file#alternative-flow for more information. find: Search notes with optional
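The two setup steps above typically mean editing Claude Desktop's claude_desktop_config.json. A minimal sketch, assuming the server is launched via npx (the package name mcp-gkeep is a placeholder, not necessarily the real one; the env var names follow the entry above):

```json
{
  "mcpServers": {
    "google-keep": {
      "command": "npx",
      "args": ["-y", "mcp-gkeep"],
      "env": {
        "GOOGLE_EMAIL": "you@gmail.com",
        "GOOGLE_MASTER_TOKEN": "your-master-token"
      }
    }
  }
}
```

Restart Claude Desktop after editing the file so the new server is picked up.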
mcp-difyworkflow-server is an MCP server application that implements querying and invoking Dify workflows, supporting on-demand operation of multiple custom Dify workflows. - "base-url":"http://localhost/v1" sets the base URL of the Dify platform API server. - "command":"mcp-difyworkflow-server" can be the absolute path of the compiled binary, or a symbolic link to it
A modern, cloud-based version of the Make MCP Server is now available. For most use cases, we recommend using this new version. A Model Context Protocol server that enables Make scenarios to be used as tools by AI assistants. This integration allows AI systems to trigger and interact with your Make automation workflows. The MCP server: - Connects to your Make account and identifies all scenarios
An experimental and educational Ping-Pong server demonstrating MCP (Model Context Protocol) calls via FastAPI. - ✅ FastAPI/FastMCP backend for remote MCP calls through API endpoints or SSE - 🔄 MCP integration for command handling - 🔐 Thread-safe session management To install the dependencies, run: Start the FastAPI server with: The server will start at http://localhost:8080. Open mcp-api-client.
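The entry advertises thread-safe session management. As a rough, illustrative sketch (not the project's actual code), a lock-guarded session store in Python might look like:

```python
import threading
import uuid

class SessionStore:
    """Minimal thread-safe session store (illustrative only)."""
    def __init__(self):
        self._lock = threading.Lock()
        self._sessions = {}

    def create(self) -> str:
        # Allocate a fresh session id under the lock.
        sid = str(uuid.uuid4())
        with self._lock:
            self._sessions[sid] = {"history": []}
        return sid

    def append(self, sid: str, message: str) -> None:
        with self._lock:
            self._sessions[sid]["history"].append(message)

    def history(self, sid: str) -> list:
        # Return a copy so callers never mutate shared state.
        with self._lock:
            return list(self._sessions[sid]["history"])

store = SessionStore()
sid = store.create()
store.append(sid, "ping")
print(store.history(sid))  # ['ping']
```

The lock serializes access so concurrent FastAPI request handlers cannot corrupt the shared dictionary.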
A Model Context Protocol (MCP) server implementation that provides the LLM an interface for visualizing data using Vega-Lite syntax. The server offers two core tools: - Save a table of data aggregations to the server for later visualization - name (string): Name of the data table to be saved - data (array): Array of objects representing the data table - Returns: success message - visualize_data
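To make the two tools concrete, here is a hypothetical payload: a data table shaped as the entry describes (a name plus an array of objects) and a plain Vega-Lite bar-chart spec over it. The spec is a generic Vega-Lite example, not necessarily what this server emits:

```python
import json

# Data table as the entry describes: a name and an array of objects.
table = {
    "name": "sales_by_region",
    "data": [
        {"region": "North", "total": 120},
        {"region": "South", "total": 95},
    ],
}

# A minimal Vega-Lite spec referencing that table (illustrative).
spec = {
    "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
    "data": {"values": table["data"]},
    "mark": "bar",
    "encoding": {
        "x": {"field": "region", "type": "nominal"},
        "y": {"field": "total", "type": "quantitative"},
    },
}

print(json.dumps(spec, indent=2))
```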
A Model Context Protocol server for calculating. This server enables LLMs to use a calculator for precise numerical calculations. - calculate: Calculates/evaluates the given expression. - expression (string, required): Expression to be calculated. When using uv, no specific installation is needed; we will use uvx to run mcp-server-calculator directly. Alternatively, you can install mcp-server-calculator
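The server's internals aren't shown here, but a calculate tool of this kind can evaluate expressions without Python's unsafe eval(). A minimal sketch using the stdlib ast module (illustrative only, not the package's actual implementation):

```python
import ast
import operator

# Whitelisted operators; anything else is rejected.
OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def calculate(expression: str) -> float:
    """Evaluate an arithmetic expression without calling eval()."""
    def ev(node):
        if isinstance(node, ast.Expression):
            return ev(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](ev(node.left), ev(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in OPS:
            return OPS[type(node.op)](ev(node.operand))
        raise ValueError("unsupported expression")
    return ev(ast.parse(expression, mode="eval"))

print(calculate("2 * (3 + 4.5)"))  # 15.0
```

Walking the parsed tree and dispatching only on whitelisted node types means names, calls, and attribute access are rejected instead of executed.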
Keep an eye on your API usage. Add the following to your Claude Desktop config file: macOS: ~/Library/Application Support/Claude/claude_desktop_config.json Windows: %APPDATA%\Claude\claude_desktop_config.json For local development, use the path to your local repository. - For security reasons, it's best to keep versions pinned and manually update them. All tools have been implemented and tested ✅
This is a community-driven server! Contentful has released an official server, which you can find here. An MCP server implementation that integrates with Contentful's Content Management API, providing comprehensive content management capabilities. - Please note: if you are not interested in the code and just want to use this MCP in Claude Desktop (or any other tool that is able to use MCP servers
Model Context Protocol (MCP) server integrating with the Miro platform. It enables AI assistants (like Claude) to access Miro boards and manage their content through a standardized interface. - Node.js v16 or newer installed - Miro account with API token 1. Go to the Miro Developer Portal 2. Create a new app or use an existing one 3. Make sure to create a token with the permissions selected below 4. Generate the token
MCP server for interacting with RabbitMQ
English Version A Model-Context-Protocol (MCP) server for integrating with the Yuque API. This implementation is inspired by Figma-Context-MCP and uses the Yuque Open API. The server provides MCP tools for interacting with the Yuque knowledge-base platform, allowing AI models to: - Get user and document information - Create, read, update, and delete documents - Get statistics and analytics To install the Yuque MCP Server into Claude Desktop automatically via Smithery: - Node.js 18+ (recommended) - A Yuque account with an API token 3. Create a .env file based on .env.example: 4. (Optional) Add your Yuque API token to the .env file: Instead of setting it in the .env file, you can also provide the token via a query parameter when connecting to the server. Then run in HTTP or CLI mode:
A Model Context Protocol (MCP) server implementation that enables AI assistants to search and reference Kibela content. This setup allows AI models like Claude to securely access information stored in Kibela. The mcp-kibela server provides the following features: - Note Search: Search Kibela notes by keywords - My Notes: Fetch your latest notes - Note Content: Get note content and comments by ID
Typst MCP Server is an MCP (Model Context Protocol) implementation that helps AI models interact with Typst, a markup-based typesetting system. The server provides tools for converting between LaTeX and Typst, validating Typst syntax, and generating images from Typst code. It offers the following tools: 1. list_docs_chapters(): Lists all chapters in the Typst documentation. - Lets the LLM
A Model Context Protocol (MCP) server that enables LLMs to interact with Plane.so, allowing them to manage projects and issues through Plane's API. Using this server, LLMs like Claude can directly interact with your project management workflows while maintaining user control and security. - List all projects in your Plane workspace - Get detailed information about specific projects - Create new issues
MCP server implementation for Kibela API integration, enabling LLMs to interact with Kibela content. - Search notes with advanced filters - Get your latest notes - Get note content and comments - Manage groups and folders - Like/unlike notes - List users - View note attachments - View recently viewed notes - Get notes by path - KIBELA_TEAM: Your Kibela team name (required) - KIBELA_TOKEN: Your Kibela API token
A Model Context Protocol (MCP) server that enables AI assistants to get human input when needed. This tool creates tasks on Amazon Mechanical Turk that let real humans answer questions from AI systems. While primarily a proof-of-concept, it demonstrates how to build human-in-the-loop AI systems using the MCP standard. See limitations for current constraints. - Node.js 16+ - AWS credentials with MTurk access
Time MCP Server: A Model Context Protocol server that enables AI assistants to interact with time. The Time MCP Server is a Model Context Protocol (MCP) server that provides AI assistants and other MCP clients with standardized tools to perform time- and date-related operations. This server acts as a bridge between AI tools and a robust time-handling backend, allowing for complex time manipulations.
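The entry doesn't list the exact tools, but typical operations such a server exposes (current time, timezone conversion) map directly onto Python's standard library. An illustrative example, not the server's actual code:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# Convert a fixed UTC timestamp to a named IANA timezone.
utc_time = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
local = utc_time.astimezone(ZoneInfo("America/New_York"))
print(local.isoformat())  # 2024-01-01T07:00:00-05:00
```

Using IANA zone names (rather than fixed offsets) is what lets such a backend handle daylight-saving transitions correctly.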
Talk to Jira. This is a TypeScript-based MCP server that provides tools to interact with Jira. It demonstrates core MCP concepts by providing: - Tools for executing JQL queries - Tools for creating, editing, and deleting Jira tickets - Tools for listing Jira projects and statuses - Purpose: Run a JQL query. - Parameters: jql, number_of_results (default: 1). - Purpose: Fetch ticket name and description
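A JQL query ultimately hits Jira's REST search endpoint. As an illustration of the parameters involved (the domain and query below are placeholders, and this is not the server's actual code):

```python
from urllib.parse import urlencode

def jql_search_url(base: str, jql: str, max_results: int = 1) -> str:
    """Build a Jira REST search URL for a JQL query (illustrative)."""
    query = urlencode({"jql": jql, "maxResults": max_results})
    return f"{base}/rest/api/2/search?{query}"

url = jql_search_url(
    "https://your-domain.atlassian.net",
    'project = DEMO AND status = "In Progress"',
)
print(url)
```

urlencode handles the quoting of spaces and quotes in the JQL string, which is the usual source of bugs when building these URLs by hand.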
Transform your CRM workflow with AI! 🤖✨ Connect Claude directly to your CapsuleCRM account for natural language customer and sales management. - 🗣️ Talk to your CRM in plain English - "Show me all VIP customers from last month" - 🔍 Smart search and filtering - Find exactly what you need with powerful queries - 📊 Get instant insights - Query sales pipeline, customer data, and tasks - ⚡ Automate
MCP server for Offorte - Create & send proposals using AI. This server acts as the bridge between AI agents and Offorte's proposal engine. It enables external models to create and send proposals via Offorte. Built for automation workflows, the MCP makes it easy to integrate proposal actions into AI tools, chat interfaces, and autonomous systems. - About Offorte - Goals & Coverage - Prerequisites
godoc-mcp is a Model Context Protocol (MCP) server that provides efficient access to Go documentation. It helps LLMs understand Go projects by providing direct access to package documentation without needing to read entire source files. godoc-mcp can vastly improve the performance of using LLMs to develop in Go by substantially reducing the number of tokens needed to understand and make use of Go
A Model Context Protocol (MCP) server that lets you seamlessly use OpenAI's models right from Claude. - Direct integration with OpenAI's chat models - Support for multiple models including: - gpt-4o-mini - o1-preview - Simple message passing interface - Basic error handling - Node.js >= 18 (includes npm and npx) - Claude Desktop app - OpenAI API key First, make sure you've got the Claude Desktop app
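Configuring the bridge usually comes down to one entry in claude_desktop_config.json. A sketch, assuming an npx-launched package (the name mcp-server-openai is a placeholder, not necessarily the real package) and the standard OPENAI_API_KEY variable:

```json
{
  "mcpServers": {
    "openai": {
      "command": "npx",
      "args": ["-y", "mcp-server-openai"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```

Keeping the key in the env block (rather than in args) keeps it out of process listings.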
MCP (Model Context Protocol) server for the Nanana AI image generation service, powered by Google Gemini's nano banana model. This server allows Claude Desktop and other MCP clients to generate and transform images using nano banana's powerful image-generation capabilities. 1. Visit nanana.app and sign in 2. Go to your account dashboard 3. Generate an API token in the "API Access" section 4. Copy and save it
This is an MCP server that analyzes the screen with OmniParser and automatically operates the GUI. Confirmed to work on Windows. This project is MIT-licensed, excluding submodules and sub-packages: OmniParser's repository is CC-BY-4.0, and each OmniParser model has a different license (reference). 1. Please do the following: (On platforms other than Windows, use export instead of set.) (If you want langchain_example.py to work,