# Mattermost MCP Host

A Mattermost integration that connects to Model Context Protocol (MCP) servers, leveraging a LangGraph-based AI agent to provide an intelligent interface for interacting with users and executing tools directly within Mattermost.

## Demo

### 1. GitHub agent in a support channel - searches existing issues and PRs, and creates a new issue if none is found

### 2. Search the internet and post to a channel using the Mattermost MCP server

#### Scroll down for the full demo on YouTube

## Features

- **LangGraph Agent Integration**: Uses a LangGraph agent to understand user requests and orchestrate responses.
- **MCP Server Integration**: Connects to multiple MCP servers defined in `mcp-servers.json`.
- **Dynamic Tool Loading**: Automatically discovers tools from connected MCP servers and makes them available to the AI agent by converting MCP tools to LangChain structured tools.
- **Thread-Aware Conversations**: Maintains conversational context within Mattermost threads for coherent interactions.
- **Intelligent Tool Use**: The AI agent can decide when to use available tools (including chaining multiple calls) to fulfill user requests.
- **MCP Capability Discovery**: Lets users list available servers, tools, resources, and prompts via direct commands.
- **Direct Command Interface**: Interact directly with MCP servers using a command prefix (default: `#`).

## Overview

The integration works as follows:

1. **Mattermost Connection (`mattermost_client.py`)**: Connects to the Mattermost server via the API and a WebSocket to listen for messages in a specified channel.
2. **MCP Connections (`mcp_client.py`)**: Establishes connections (primarily `stdio`) to each MCP server defined in `src/mattermost_mcp_host/mcp-servers.json` and discovers the tools available on each server.
3. **Agent Initialization (`agent/llm_agent.py`)**: A `LangGraphAgent` is created, configured with the chosen LLM provider and the dynamically loaded tools from all connected MCP servers.
4. **Message Handling (`main.py`)**:
   * If a message starts with the command prefix (`#`), it is parsed as a direct command to list servers/tools or to call a specific tool via the corresponding `MCPClient`.
   * Otherwise, the message (along with its thread history) is passed to the `LangGraphAgent`.
5. **Agent Execution**: The agent processes the request, potentially calling one or more MCP tools via the `MCPClient` instances, and generates a response.
6. **Response Delivery**: The final response from the agent or command execution is posted back to the appropriate Mattermost channel/thread.

## Setup

1. **Clone the repository:**

   ```bash
   git clone <repository-url>
   cd mattermost-mcp-host
   ```

2. **Install:**
   * Using uv (recommended):

   ```bash
   # Install uv if you don't have it yet
   # curl -LsSf https://astral.sh/uv/install.sh | sh

   # Install the package with uv (this also creates the .venv)
   uv sync

   # Activate the virtual environment
   source .venv/bin/activate

   # To install dev dependencies
   uv sync --dev --all-extras
   ```
3. **Configure Environment (`.env` file):**
   Copy `.env.example` and fill in the values, or create a `.env` file in the project root (or set environment variables):

   ```env
   # Mattermost Details
   MATTERMOST_URL=http://your-mattermost-url
   MATTERMOST_TOKEN=your-bot-token  # Needs permissions to post, read channels, etc.
   MATTERMOST_TEAM_NAME=your-team-name
   MATTERMOST_CHANNEL_NAME=your-channel-name  # Channel for the bot to listen in
   # MATTERMOST_CHANNEL_ID=  # Optional: auto-detected if the channel name is provided

   # LLM Configuration (Azure OpenAI is the default)
   DEFAULT_PROVIDER=azure
   AZURE_OPENAI_ENDPOINT=your-azure-endpoint
   AZURE_OPENAI_API_KEY=your-azure-api-key
   AZURE_OPENAI_DEPLOYMENT=your-deployment-name  # e.g., gpt-4o
   # AZURE_OPENAI_API_VERSION=  # Optional, defaults provided

   # Optional: other providers (install with the `[all]` extra)
   # OPENAI_API_KEY=...
   # ANTHROPIC_API_KEY=...
   # GOOGLE_API_KEY=...

   # Command Prefix
   COMMAND_PREFIX=#
   ```

   See `.env.example` for more options.

4. **Configure MCP Servers:**
   Edit `src/mattermost_mcp_host/mcp-servers.json` to define the MCP servers you want to connect to; see `src/mattermost_mcp_host/mcp-servers-example.json` for reference. Depending on the server configuration, you may need `npx`, `uvx`, or `docker` installed on your system and available on your `PATH`.

5. **Start the Integration:**

   ```bash
   mattermost-mcp-host
   ```

## Prerequisites

- Python 3.13.1+
- uv package manager
- A Mattermost server instance
- A Mattermost bot account with an API token
- Access to an LLM API (Azure OpenAI by default)

### Optional

- One or more MCP servers configured in `mcp-servers.json`
- Tavily web search (requires `TAVILY_API_KEY` in the `.env` file)

## Usage in Mattermost

Once the integration is running and connected:

1. **Direct Chat:** Simply chat in the configured channel or message the bot directly. The AI agent will respond, using tools as needed, and maintains context within message threads.
2. **Direct Commands:** Use the command prefix (default `#`) for specific actions:
   * `#help` - Display help information.
   * `#servers` - List configured and connected MCP servers.
   * `#<server_name> tools` - List available tools for `<server_name>`.
   * `#<server_name> call <tool_name> <json_arguments>` - Call `<tool_name>` on `<server_name>` with arguments provided as a JSON string.
     * Example: `#my-server call echo '{"message": "Hello MCP!"}'`
   * `#<server_name> resources` - List available resources for `<server_name>`.
   * `#<server_name> prompts` - List available prompts for `<server_name>`.

## Next Steps

- **Configurable LLM Backend**: Supports multiple AI providers (Azure OpenAI by default; also OpenAI, Anthropic Claude, and Google Gemini) via environment variables.

## Mattermost Setup

1. **Create a Bot Account**
   - Go to Integrations > Bot Accounts > Add Bot Account
   - Give it a name and description
   - Save the access token in the `.env` file

2. **Required Bot Permissions**
   - post_all
   - create_post
   - read_channel
   - create_direct_channel
   - read_user

3. **Add Bot to Team/Channel**
   - Invite the bot to your team
   - Add the bot to the desired channels

### Troubleshooting

1. **Connection Issues**
   - Verify the Mattermost server is running
   - Check bot token permissions
   - Ensure the team/channel names are correct

2. **AI Provider Issues**
   - Validate API keys
   - Check API quotas and limits
   - Verify network access to API endpoints

3. **MCP Server Issues**
   - Check server logs
   - Verify server configurations
   - Ensure required dependencies are installed and environment variables are defined

## Demos

### Create an issue via chat using the GitHub MCP server

### Full demo (YouTube)

[Watch the full demo on YouTube](https://youtu.be/s6CZY81DRrU)

## Contributing

Please feel free to open a PR.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
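## Appendix: Example `mcp-servers.json`

As a starting point for step 4 of the Setup, the sketch below shows what `mcp-servers.json` might look like using the common `mcpServers` stdio layout. The server names, package names, paths, and token value here are illustrative placeholders, not the project's required values; check `src/mattermost_mcp_host/mcp-servers-example.json` for the schema this project actually expects.

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "your-github-token"
      }
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```

Each entry launches one stdio MCP server as a subprocess (`command` plus `args`), with optional `env` variables passed to it; the key (e.g. `github`) is the `<server_name>` you would then use in direct commands such as `#github tools`.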