mdskills

LLM & AI Agent Skills

AI agent skills for working with large language models. Prompt engineering, API integration, and AI workflow patterns.

223 listings

Timeplus MCP Server

MCP Server

An MCP server for Timeplus. - generate_sql: give the LLM more knowledge about how to query Timeplus via SQL. - Execute SQL queries on your Timeplus cluster. Input: sql (string): the SQL query to execute. By default, all Timeplus queries are run with readonly = 1 to ensure they are safe; if you want to run DDL or DML queries, set the environment variable TIMEPLUS_READONLY to false. - list_databas

8.0 · jovezhong/mcp-timeplus

Climatiq MCP Server

MCP Server

A Model Context Protocol (MCP) server for accessing the Climatiq API to calculate carbon emissions. This allows AI assistants to perform real-time carbon calculations and provide climate impact insights. This MCP server integrates with the Climatiq API to provide carbon emission calculations for AI assistants: - set-api-key: Configure the Climatiq API key used for authentication - electricity-emis

8.0 · jagan-shanmugam/climatiq-mcp-server

LLM Context

Smart context management for LLM development workflows. Share relevant project files instantly through intelligent selection and rule-based filtering. Getting the right context into LLM conversations is friction-heavy: - Manually finding and copying relevant files wastes time - Too much context hits token limits, too little misses important details - AI requests for additional files require manual

8.0 · cyberchitta/llm-context.py

Recursive Decomposition Skill

Plugin

Handle long-context tasks with Claude Code through recursive decomposition. When analyzing large codebases, processing many documents, or aggregating information across dozens of files, Claude's context window becomes a bottleneck. As context grows, "context rot" degrades performance: - Missed de

6.0 · massimodeluisa/recursive-decomposition-skill

LLM Evaluation

Plugin

LLM evaluation and testing patterns including prompt testing, hallucination detection, benchmark creation, and quality metrics. Use when testing LLM applications, validating prompt quality, implementing systematic evaluation, or measuring LLM performance.

7.0 · applied-artificial-intelligence/claude-code-toolkit

MCP Chess Server

MCP Server

This MCP lets you play chess against any LLM. To use this chess server, add the following configuration to your MCP config. Play a game, or find a position in a PGN for game analysis. The server provides the following tools: get_board_visualization(): provides the current state of the chessboard as an image; the board orientation automatically flips based on the user's assigned color. get_turn(): indic

8.0 · jiayao/mcp-chess

LLM App Patterns

Production-ready patterns for building LLM applications. Covers RAG pipelines, agent architectures, prompt IDEs, and LLMOps monitoring. Use when designing AI applications, implementing RAG, building agents, or setting up LLM observability.

6.0 · sickn33/antigravity-awesome-skills

GitMCP

MCP Server

Stop vibe-hallucinating and start vibe-coding! GitMCP is a free, open-source, remote Model Context Protocol (MCP) server that transforms any GitHub project (repository or GitHub Pages) into a documentation hub. It enables AI tools like Cursor to access up-to-date documentation and code, even if the LLM has nev

8.0 · idosal/git-mcp
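Remote MCP servers like GitMCP are typically added to a client by URL rather than by launching a local process. A hedged sketch of a client MCP config, assuming the gitmcp.io/{owner}/{repo} endpoint pattern; the server name and placeholders are illustrative:

```json
{
  "mcpServers": {
    "my-repo-docs": {
      "url": "https://gitmcp.io/{owner}/{repo}"
    }
  }
}
```

Replace {owner}/{repo} with the GitHub project whose documentation the assistant should read.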

Firecrawl Plugin for Claude Code


8.0 · firecrawl/firecrawl-claude-plugin

AI Engineer

Build production-ready LLM applications, advanced RAG systems, and

8.0 · sickn33/antigravity-awesome-skills

aiagentflow

A local-first CLI that orchestrates multi-agent AI workflows for software development. Give it a task — or feed it your specs, PRDs, and guidelines — and it coordinates specialized agents to architect, code, review, test, and ship automatically. No cloud dependency. Bring your own API keys. Your code stays on your machine. Each stage uses a specialized AI agent with tuned prompts and parameters. T

aiagentflow/aiagentflow

Louis030195/toggl MCP

MCP Server

Dead simple MCP (Model Context Protocol) server for Toggl time tracking. Control your Toggl timer directly from Claude, ChatGPT, or any LLM that supports MCP. - ⏱️ Start/stop timers - 📊 View current timer - 📈 Get today's time entries - 🗂️ List projects - 🗑️ Delete time entries Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json): 1. Go to Tog

7.0 · louis030195/toggl-mcp
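Claude Desktop MCP entries share one general shape in claude_desktop_config.json. A minimal sketch for a server like this one, where the launch command, package name, and TOGGL_API_TOKEN variable are assumptions rather than details from the listing:

```json
{
  "mcpServers": {
    "toggl": {
      "command": "npx",
      "args": ["-y", "toggl-mcp"],
      "env": { "TOGGL_API_TOKEN": "<your-toggl-api-token>" }
    }
  }
}
```

Check the repository's README for the actual command and environment variables.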

Video Editor MCP server

MCP Server

See a demo here: https://www.youtube.com/watch?v=KG6TMLD8GmA Upload, edit, search, and generate videos from everyone's favorite LLM and Video Jungle. You'll need to sign up for an account at Video Jungle in order to use this tool, and add your API key. The server implements an interface to upload, generate, and edit videos with: - Custom vj:// URI scheme for accessing individual videos and project

8.0 · burningion/video-editing-mcp

Clarity Gate

Plugin

Pre-ingestion verification for epistemic quality in RAG systems. Ensures documents are properly qualified before entering knowledge bases. Produces CGD (Clarity-Gated Documents) and validates SOT (Source of Truth) files.

2.0 · frmoretto/clarity-gate

Prompt Caching

Caching strategies for LLM prompts including Anthropic prompt caching, response caching, and CAG (Cache Augmented Generation). Use when: prompt caching, cache prompt, response cache, cag, cache augmented.

6.0 · sickn33/antigravity-awesome-skills
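Response caching is the simplest of these strategies: key the cache on the exact (model, prompt) pair and skip the API call on a hit. A minimal local sketch, independent of any particular SDK (all names hypothetical):

```python
import hashlib

class ResponseCache:
    """In-memory response cache keyed by a hash of (model, prompt)."""

    def __init__(self):
        self._store = {}

    def _key(self, model, prompt):
        # NUL separator prevents ("ab", "c") colliding with ("a", "bc")
        return hashlib.sha256(f"{model}\x00{prompt}".encode()).hexdigest()

    def get(self, model, prompt):
        return self._store.get(self._key(model, prompt))

    def put(self, model, prompt, response):
        self._store[self._key(model, prompt)] = response

cache = ResponseCache()

def complete(model, prompt, call_llm):
    """Return a cached response when available; otherwise call the LLM once."""
    cached = cache.get(model, prompt)
    if cached is not None:
        return cached
    response = call_llm(prompt)
    cache.put(model, prompt, response)
    return response
```

Exact-match caching only helps for repeated identical prompts; provider-side prompt caching (as in Anthropic's API) instead reuses a shared prefix across different requests.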

LLM Application Dev Prompt Optimize

You are an expert prompt engineer specializing in crafting effective prompts for LLMs through advanced techniques including constitutional AI, chain-of-thought reasoning, and model-specific optimizati

6.0 · sickn33/antigravity-awesome-skills
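A prompt optimizer of this kind often starts by wrapping the raw task in an explicit chain-of-thought scaffold. A minimal sketch of such a wrapper; the helper name and step wording are illustrative, not part of the toolkit:

```python
def with_chain_of_thought(
    task,
    steps=("Restate the problem", "Work step by step", "State the final answer"),
):
    """Append a numbered reasoning scaffold to a raw task prompt."""
    scaffold = "\n".join(f"{i}. {step}." for i, step in enumerate(steps, 1))
    return f"{task}\n\nThink before answering:\n{scaffold}"
```

Usage: `with_chain_of_thought("What is 17 * 24?")` yields the task followed by the three numbered steps; model-specific optimization would then tune the step wording per target model.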

Findata MCP Server

MCP Server

FinData is an open-source Model Context Protocol (MCP) server that provides professional financial data access capabilities for LLMs. It supports various data providers such as Tushare, Wind, and DataYes, enabling AI applications to quickly retrieve financial data. Fully supports both stdio and SSE transports, offering flexibility for differen

8.0 · zlinzzzz/finData-mcp-server

MetaTrader MCP Server

MCP Server

Let AI assistants trade for you using natural language. MetaTrader MCP Server is a bridge that connects AI assistants (like Claude and ChatGPT) to the MetaTrader 5 trading platform. Instead of clicking buttons, you can simply tell your AI assistant what to do. The AI understands your request and executes it on MetaTrader

8.0 · ariadng/metatrader-mcp-server

This Project Has Moved!

MCP Server

Migrate to MCP Platform • 💬 Discord Community • Legacy Docs Zero-configuration deployment of production-ready MCP servers with Docker containers, comprehensive CLI tools, and intelligent caching. Focus on AI integration, not infrastructure setup. That's it! Your MCP server is running at http://localhost:8080 Perfect for: AI developers, data scientists, DevOps teams building with MCP. Deploy M

3.0 · Data-Everything/mcp-server-templates

Prompt Engineering

Expert guide on prompt engineering patterns, best practices, and optimization techniques. Use when user wants to improve prompts, learn prompting strategies, or debug agent behavior.

8.0 · sickn33/antigravity-awesome-skills

Stripe AI

Guide for upgrading Stripe API versions and SDKs

7.0 · stripe/ai

Open Data Model Context Protocol

Connect Open Data to LLMs in minutes! We enable 2 things: Open Data Access: access to many public datasets right from your LLM application (starting with Claude, more to come). Publishing: get community help and a distribution network to distribute your Open Data. Get everyone to use it! How do we do that? Access: set up our MCP servers in your LLM application in 2 clicks via our CLI tool (starting

7.0 · OpenDataMCP/OpenDataMCP

Safe Local Python Executor

MCP Server

An MCP server (stdio transport) that wraps Hugging Face's LocalPythonExecutor (from the smolagents framework). It is a custom Python runtime that provides basic isolation/security when running LLM-generated Python code locally, without requiring Docker or a VM. This package allows you to expose the Python executor via MCP (Model Context Protocol) as a tool for LLM apps like Claude Desktop, Cursor

8.0 · maxim-saplin/mcp_safe_local_python_executor

Open Strategy Partners (OSP) Marketing Tools for LLMs

A comprehensive suite of tools for technical marketing content creation, optimization, and product positioning based on Open Strategy Partners' proven methodologies. This software is based on the Model Context Protocol (MCP) and can be used by any LLM client that supports MCP. As of early February 2025, the LLM clients that support MCP include: - Claude desktop app is the easiest to use for

7.0 · open-strategy-partners/osp_marketing_tools