by pblagoje • Uncategorized
A Python MCP server exposing local Ollama models as tools for AI assistants without internet.
Chat with local Ollama models using multi-turn conversations and tool-calling.
Generate text completions or create vector embeddings locally.
Manage local Ollama models including listing, pulling, deleting, and inspecting models.
This MCP server lets AI assistants run local Ollama large language models directly on the user's machine, with no cloud APIs required. It provides 8 MCP tools covering chat, text generation, embeddings, model management, and more, with a hot-swap architecture that makes adding new tools straightforward. The server is lightweight, type-safe, and works with any MCP client, so it integrates cleanly with assistants such as Windsurf, VS Code, and Claude Desktop.
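As a rough sketch of what such a server does under the hood, the snippet below builds and sends a non-streaming chat request to a local Ollama instance's `/api/chat` endpoint (Ollama's default port is 11434). The helper names `build_chat_request` and `ollama_chat` are illustrative only; they are not part of this server's API.

```python
import json
import urllib.request

# Ollama's default local chat endpoint (assumed; no internet access needed)
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_request(model: str, messages: list[dict]) -> dict:
    """Build a non-streaming chat payload for Ollama's /api/chat endpoint."""
    return {"model": model, "messages": messages, "stream": False}

def ollama_chat(model: str, messages: list[dict]) -> str:
    """Send a chat request to a local Ollama instance and return the reply text."""
    payload = json.dumps(build_chat_request(model, messages)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["message"]["content"]

# Example (requires `ollama serve` running with the model already pulled):
# print(ollama_chat("llama3.2", [{"role": "user", "content": "Hello!"}]))
```

An MCP tool like the chat tool above essentially wraps a call of this shape, letting the assistant supply the model name and conversation history as tool arguments.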