Query and analyze LLM traces with AI assistance. Ask Claude to find expensive API calls, debug errors, compare model performance, or track token usage, all from your IDE. This is an MCP (Model Context Protocol) server that connects AI assistants to OpenTelemetry trace backends (Jaeger, Tempo, Traceloop), with specialized support for LLM observability through OpenLLMetry semantic conventions.
Add this skill:

```shell
npx mdskills install traceloop/opentelemetry-mcp-server
```

A comprehensive MCP server for OpenTelemetry trace analysis with excellent LLM-specific tooling and multi-backend support.
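Once installed, an MCP server like this is typically registered in the assistant's MCP configuration file. Below is a minimal sketch in the Claude Desktop `claude_desktop_config.json` format; the launch command, package name, and environment variable names (`OTEL_BACKEND`, `OTEL_ENDPOINT`) are placeholders, not confirmed by this listing — consult the server's own README for the actual values:

```json
{
  "mcpServers": {
    "opentelemetry": {
      "command": "npx",
      "args": ["-y", "opentelemetry-mcp-server"],
      "env": {
        "OTEL_BACKEND": "jaeger",
        "OTEL_ENDPOINT": "http://localhost:16686"
      }
    }
  }
}
```

The `env` block is where backend-specific settings (which trace backend to query and where it is reachable) would go; the keys shown are illustrative only.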