by vladimiracunadev-create • Uncategorized
A lightweight local chat platform integrating Ollama LLMs with the Model Context Protocol (MCP) and secure tools.
A fully local AI chat environment with privacy and no cloud dependencies.
Extend LLM capabilities with secure system tools like file access and system info.
Persistent conversation history stored locally with SQLite.
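The persistence layer above can be sketched with Python's standard-library SQLite bindings. The table name, columns, and helper functions here are assumptions for illustration; the project's actual schema is not shown in this description.

```python
import sqlite3

def init_db(path=":memory:"):
    # Hypothetical "messages" table keyed by a conversation id;
    # ":memory:" is used for the demo, a file path would persist to disk.
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS messages (
               id INTEGER PRIMARY KEY AUTOINCREMENT,
               conversation_id TEXT NOT NULL,
               role TEXT NOT NULL,          -- 'user' or 'assistant'
               content TEXT NOT NULL,
               created_at TEXT DEFAULT CURRENT_TIMESTAMP
           )"""
    )
    return conn

def save_message(conn, conversation_id, role, content):
    conn.execute(
        "INSERT INTO messages (conversation_id, role, content) VALUES (?, ?, ?)",
        (conversation_id, role, content),
    )
    conn.commit()

def load_history(conn, conversation_id):
    # Replay messages in insertion order to rebuild a conversation.
    rows = conn.execute(
        "SELECT role, content FROM messages WHERE conversation_id = ? ORDER BY id",
        (conversation_id,),
    )
    return [{"role": r, "content": c} for r, c in rows]

conn = init_db()
save_message(conn, "demo", "user", "Summarize my notes file.")
save_message(conn, "demo", "assistant", "Here is a summary...")
history = load_history(conn, "demo")
print(history)
```

Because everything lives in a local SQLite file, history survives restarts without any cloud storage.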
MCP Ollama Local runs a fully local AI chat environment that combines Ollama's local LLMs with the extensible Model Context Protocol (MCP). It provides a clean web interface, secure system tools, and persistent conversation history stored in SQLite, preserving total privacy by running entirely on the user's machine. The platform supports flexible deployment, including local, Docker, and Kubernetes environments.
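Since Ollama serves models over a local HTTP API, a chat turn in such a platform ultimately reduces to a request against Ollama's documented `/api/chat` endpoint on `localhost:11434`. The sketch below only constructs the JSON body for such a request; the model name "llama3" is an assumption, and any locally pulled model would do.

```python
import json

def build_chat_request(model, history, user_message):
    # Assemble the body for POST http://localhost:11434/api/chat.
    # "history" is a list of {"role": ..., "content": ...} dicts,
    # e.g. loaded from the local SQLite store; the new user turn is appended.
    messages = history + [{"role": "user", "content": user_message}]
    return json.dumps({"model": model, "messages": messages, "stream": False})

payload = build_chat_request("llama3", [], "Hello, what can you do?")
print(payload)
```

Sending this payload with any HTTP client (and no outbound network access) is what keeps the whole loop on the user's machine.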