by kevensen • Uncategorized
An MCP server written in Go that lets AI assistants interact with local Ollama models.
Manage Ollama models locally through the MCP protocol.
Generate text completions and chat responses using Ollama models.
Generate vector embeddings from text using Ollama.
Gollama MCP Server enables AI assistants to communicate with local Ollama models using the Model Context Protocol (MCP). It supports model management tasks such as listing, pulling, pushing, deleting, copying, and viewing models, as well as generating completions, chat interactions, and embeddings. The server can run in STDIO or HTTP mode and is easily deployable via Docker, facilitating seamless integration with Ollama instances.
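Since the server runs in STDIO mode, wiring it into an MCP client is a matter of pointing the client at the binary. A minimal client configuration might look like the following; the command name `gollama-mcp-server` is an assumption, and whether the server reads `OLLAMA_HOST` (Ollama's standard host variable) is not confirmed by this listing.

```json
{
  "mcpServers": {
    "gollama": {
      "command": "gollama-mcp-server",
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
```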