# cognee‑mcp

Run cognee's memory engine as a Model Context Protocol server. Build memory for Agents and query it from any client that speaks MCP, in your terminal or IDE.

- **Multiple transports** – choose Streamable HTTP (`--transport http`, recommended for web deployments), SSE (`--transport sse`, real‑time streaming), or stdio (classic pipe, the default)
- **API Mode** – connect to an a…
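As a sketch of how a client might launch the server over the default stdio transport, here is a minimal MCP client configuration in the common `mcpServers` JSON shape. The command name `cognee` and the server key are assumptions for illustration; only the `--transport` flag values come from the feature list above, so check cognee's own docs for the real entrypoint.

```json
{
  "mcpServers": {
    "cognee": {
      "command": "cognee",
      "args": ["--transport", "stdio"]
    }
  }
}
```

Swapping `"stdio"` for `"sse"` or `"http"` would select one of the other transports listed above, assuming the same flag syntax applies.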
## Add this skill

Install with: `npx mdskills install topoteretes/cognee`

Well‑documented memory MCP server with multiple transports, Docker support, and comprehensive setup.