MCP Server - Bridge to a local Ollama LLM server. Part of the HumoticaOS / SymbAIon ecosystem.

## Installation

Add the server to your `claude_desktop_config.json` (a hedged example configuration is sketched below, after the install command).

## Features

- Connect MCP clients to a local Ollama LLM
- Support for all Ollama models
- Streaming responses
- Simple configuration

## Authors

- Jasper van de Meent (@jaspertvdm)
- Root AI (Claude) - rootai@humotica.nl

One Love, One fAmIly!

## Distribution

This package is officially distributed via:

- PyPI:
## Add this skill

```
npx mdskills install jaspertvdm/mcp-server-ollama-bridge
```

Clear setup for bridging MCP to Ollama, but it lacks tool descriptions and implementation details.
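The original configuration snippet is not included above, so the following is only a minimal sketch of what a `claude_desktop_config.json` entry for this bridge might look like. The server key (`ollama-bridge`), the launch command (`uvx`), the package/entry-point name passed as an argument, and the `OLLAMA_HOST` environment variable are all assumptions, since the actual PyPI package name and entry point are not stated in this listing; `http://localhost:11434` is Ollama's default local endpoint.

```json
{
  "mcpServers": {
    "ollama-bridge": {
      "command": "uvx",
      "args": ["mcp-server-ollama-bridge"],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
```

Before wiring up the bridge, you can confirm the local Ollama server is reachable with `curl http://localhost:11434/api/tags`, which returns the list of locally installed models.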