Ollama MCP Server
hyzhak/ollama-mcp-server (Local)
A bridge that integrates Ollama's local LLM capabilities into MCP-powered applications, enabling users to run, manage, and interact with AI models locally with full control and privacy.
⚠ Local (STDIO) Server
This server runs as a process directly on your machine. It has access to your filesystem, environment variables, and SSH keys. Review the source code before installing.
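For illustration, a local STDIO server like this is typically registered in an MCP client's configuration file, which tells the client what process to spawn. This is a minimal sketch only: the `command` and `args` values below are assumptions for illustration, not taken from this listing, so check the repository's README for the actual launch command.

```json
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": ["ollama-mcp-server"]
    }
  }
}
```

Because the server runs over STDIO with your local user's permissions, reviewing the source before adding such an entry is the safeguard the warning above refers to.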