# MCP Server - Ollama Bridge

MCP server that bridges MCP clients to a local Ollama LLM server. Part of the HumoticaOS / SymbAIon ecosystem.

## Features

- Connect MCP clients to a local Ollama LLM
- Support for all Ollama models
- Streaming responses
- Simple configuration

## Setup

Add the server to your `claude_desktop_config.json`.

## Authors

- Jasper van de Meent (@jaspertvdm)
- Root AI (Claude)
- rootai@humotica.nl

One Love, One fAmIly!

## Distribution

This package is officially distributed via:

- PyPI:
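A minimal sketch of the `claude_desktop_config.json` entry, assuming the package installs an `mcp-server-ollama-bridge` command on your PATH; the actual command name, arguments, and any environment variables (such as the Ollama host) may differ from this illustration:

```json
{
  "mcpServers": {
    "ollama-bridge": {
      "command": "mcp-server-ollama-bridge",
      "args": [],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
```

Restart Claude Desktop after editing the file so the new MCP server is picked up.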
## Add this skill

```
npx mdskills install jaspertvdm/mcp-server-ollama-bridge
```