Model Context Protocol server for Langfuse observability. Query traces, debug errors, analyze sessions, manage prompts.
Comparison with official Langfuse MCP (as of Jan 2026):
| Feature | langfuse-mcp | Official MCP |
|---|---|---|
| Traces & Observations | Yes | No |
| Sessions & Users | Yes | No |
| Exception Tracking | Yes | No |
| Prompt Management | Yes | Yes |
| Dataset Management | Yes | No |
| Selective Tool Loading | Yes | No |
This project provides a full observability toolkit — traces, observations, sessions, exceptions, and prompts — while the official MCP focuses on prompt management.
Requires uv (for uvx).
Get credentials from Langfuse Cloud → Settings → API Keys. If self-hosted, use your instance URL for LANGFUSE_HOST.
```bash
# Claude Code (project-scoped, shared via .mcp.json)
claude mcp add \
  -e LANGFUSE_PUBLIC_KEY=pk-... \
  -e LANGFUSE_SECRET_KEY=sk-... \
  -e LANGFUSE_HOST=https://cloud.langfuse.com \
  --scope project \
  langfuse -- uvx --python 3.11 langfuse-mcp
```

```bash
# Codex CLI (user-scoped, stored in ~/.codex/config.toml)
codex mcp add langfuse \
  --env LANGFUSE_PUBLIC_KEY=pk-... \
  --env LANGFUSE_SECRET_KEY=sk-... \
  --env LANGFUSE_HOST=https://cloud.langfuse.com \
  -- uvx --python 3.11 langfuse-mcp
```
Restart your CLI, then verify with `/mcp` (Claude Code) or `codex mcp list` (Codex).
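If the server fails to start, a quick sanity check on the environment variables often helps. The helper below is illustrative, not part of langfuse-mcp; the pk-/sk- prefixes match Langfuse Cloud key formats:

```python
import os

def check_langfuse_env(env=os.environ):
    """Return a list of problems with the Langfuse environment variables."""
    problems = []
    if not env.get("LANGFUSE_PUBLIC_KEY", "").startswith("pk-"):
        problems.append("LANGFUSE_PUBLIC_KEY missing or not 'pk-' prefixed")
    if not env.get("LANGFUSE_SECRET_KEY", "").startswith("sk-"):
        problems.append("LANGFUSE_SECRET_KEY missing or not 'sk-' prefixed")
    # LANGFUSE_HOST falls back to Langfuse Cloud when unset
    host = env.get("LANGFUSE_HOST", "https://cloud.langfuse.com")
    if not host.startswith(("http://", "https://")):
        problems.append("LANGFUSE_HOST should be a full URL")
    return problems

if __name__ == "__main__":
    for p in check_langfuse_env():
        print("warning:", p)
```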
| Category | Tools |
|---|---|
| Traces | fetch_traces, fetch_trace |
| Observations | fetch_observations, fetch_observation |
| Sessions | fetch_sessions, get_session_details, get_user_sessions |
| Exceptions | find_exceptions, find_exceptions_in_file, get_exception_details, get_error_count |
| Prompts | list_prompts, get_prompt, get_prompt_unresolved, create_text_prompt, create_chat_prompt, update_prompt_labels |
| Datasets | list_datasets, get_dataset, list_dataset_items, get_dataset_item, create_dataset, create_dataset_item, delete_dataset_item |
| Schema | get_data_schema |
Langfuse uses upsert for dataset items. To edit an existing item, call create_dataset_item with item_id. If the ID exists, it updates; otherwise it creates a new item.
```python
create_dataset_item(
    dataset_name="qa-test-cases",
    item_id="item_123",
    input={"question": "What is 2+2?"},
    expected_output={"answer": "4"}
)
```
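The upsert rule can be pictured with a tiny local stand-in (plain Python, not the Langfuse API): dataset items are keyed by item_id, so writing to an existing key overwrites it.

```python
def upsert_item(store: dict, item_id: str, payload: dict) -> str:
    """Insert or overwrite an entry keyed by item_id, mirroring upsert semantics."""
    action = "updated" if item_id in store else "created"
    store[item_id] = payload
    return action

store = {}
print(upsert_item(store, "item_123", {"input": "2+2", "expected": "4"}))     # created
print(upsert_item(store, "item_123", {"input": "2+2", "expected": "four"}))  # updated
```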
This project includes a skill with debugging playbooks.
Via skills (recommended):

```bash
npx skills add avivsinai/langfuse-mcp -g -y
```

Via skild:

```bash
npx skild install @avivsinai/langfuse -t claude -y
```

Manual install:

```bash
cp -r skills/langfuse ~/.claude/skills/  # Claude Code
cp -r skills/langfuse ~/.codex/skills/   # Codex CLI
```
Try asking: "help me debug langfuse traces"
See skills/langfuse/SKILL.md for full documentation.
Load only the tool groups you need to reduce token overhead:
```bash
langfuse-mcp --tools traces,prompts
```
Available groups: traces, observations, sessions, exceptions, prompts, datasets, schema
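Conceptually, the flag maps group names to tool sets and registers only the requested ones. A hypothetical sketch (group and tool names taken from the table above; the real implementation may differ):

```python
# Subset of the group -> tools mapping, for illustration only
TOOL_GROUPS = {
    "traces": ["fetch_traces", "fetch_trace"],
    "observations": ["fetch_observations", "fetch_observation"],
    "prompts": ["list_prompts", "get_prompt", "get_prompt_unresolved"],
    "schema": ["get_data_schema"],
}

def select_tools(spec: str) -> list[str]:
    """Expand a comma-separated --tools value into the tools to register."""
    groups = [g.strip() for g in spec.split(",") if g.strip()]
    unknown = sorted(set(groups) - set(TOOL_GROUPS))
    if unknown:
        raise ValueError(f"unknown tool groups: {unknown}")
    return [tool for g in groups for tool in TOOL_GROUPS[g]]
```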
Disable all write operations for safer read-only access:
```bash
langfuse-mcp --read-only

# Or via environment variable
LANGFUSE_MCP_READ_ONLY=true langfuse-mcp
```
This disables: create_text_prompt, create_chat_prompt, update_prompt_labels, create_dataset, create_dataset_item, delete_dataset_item
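The read-only switch amounts to filtering that write list out of the registered tools. A sketch of how the env-var form might be interpreted (illustrative; not the actual langfuse-mcp code):

```python
import os

# The write tools listed above
WRITE_TOOLS = {
    "create_text_prompt", "create_chat_prompt", "update_prompt_labels",
    "create_dataset", "create_dataset_item", "delete_dataset_item",
}

def filter_read_only(tools, env=os.environ):
    """Drop write tools when LANGFUSE_MCP_READ_ONLY is set to a truthy value."""
    read_only = env.get("LANGFUSE_MCP_READ_ONLY", "").strip().lower() in {"1", "true", "yes"}
    return [t for t in tools if not (read_only and t in WRITE_TOOLS)]
```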
Create .cursor/mcp.json in your project (or ~/.cursor/mcp.json for global):
```json
{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["--python", "3.11", "langfuse-mcp"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-...",
        "LANGFUSE_SECRET_KEY": "sk-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}
```
```bash
docker run --rm -i \
  -e LANGFUSE_PUBLIC_KEY=pk-... \
  -e LANGFUSE_SECRET_KEY=sk-... \
  -e LANGFUSE_HOST=https://cloud.langfuse.com \
  ghcr.io/avivsinai/langfuse-mcp:latest
```
```bash
uv venv --python 3.11 .venv && source .venv/bin/activate
uv pip install -e ".[dev]"
pytest
```
MIT