by NewAITees • Modeling & Simulation
A local MCP server that bridges Ollama LLM instances with MCP-compatible applications to provide task decomposition, result evaluation, and model execution.
Break complex user requests into prioritized, dependency-aware subtasks using Ollama-driven decomposition.
Evaluate and score task results against custom criteria and generate actionable improvement suggestions.
Run queries against local Ollama models with configurable parameters (model selection, temperature, max tokens) and retrieve available model lists.
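The model-execution tool above ultimately talks to a local Ollama instance over its HTTP API. As a rough sketch (not the server's actual code), the snippet below builds a request for Ollama's `/api/generate` endpoint with the configurable parameters mentioned (model, temperature, max tokens — exposed by Ollama as `num_predict`) and fetches the available model list from `/api/tags`. The `OLLAMA_URL` default and helper names are illustrative assumptions.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint


def build_generate_payload(model, prompt, temperature=0.7, max_tokens=256):
    """Build a request body for Ollama's /api/generate endpoint.

    Ollama calls the max-token limit `num_predict` inside `options`.
    """
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete response instead of a stream
        "options": {"temperature": temperature, "num_predict": max_tokens},
    }


def run_query(model, prompt, **opts):
    """POST the payload to the local Ollama instance; return the response text."""
    body = json.dumps(build_generate_payload(model, prompt, **opts)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


def list_models():
    """GET /api/tags lists the models pulled into the local Ollama instance."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        return [m["name"] for m in json.loads(resp.read())["models"]]
```

`run_query` and `list_models` require a running Ollama daemon; `build_generate_payload` can be inspected standalone to see how the tool parameters map onto the API.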
This server implements an MCP-compatible interface to interact with local Ollama models, offering tools for task decomposition, result evaluation, and direct model execution. It provides standardized URI schemes (task://, result://, model://), schema-driven prompts, robust error handling, and performance optimizations like connection pooling and LRU caching. The project is intended for integrating Ollama into agent workflows and developer tools that use the Model Context Protocol.
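To illustrate the standardized URI schemes mentioned above, a minimal parser might look like the following. This is a sketch under assumptions — the server's actual URI layout and validation rules may differ; only the three scheme names (`task://`, `result://`, `model://`) come from the description.

```python
from urllib.parse import urlparse

# The three resource schemes the server advertises.
SCHEMES = {"task", "result", "model"}


def parse_resource_uri(uri: str):
    """Split a resource URI such as task://42 or model://llama3 into
    (scheme, identifier), rejecting unknown schemes.

    Illustrative only: the real server may structure identifiers differently.
    """
    parsed = urlparse(uri)
    if parsed.scheme not in SCHEMES:
        raise ValueError(f"unsupported scheme: {parsed.scheme!r}")
    # urlparse puts the first path segment in netloc for scheme://x/y URIs.
    identifier = parsed.netloc + parsed.path
    if not identifier:
        raise ValueError("missing resource identifier")
    return parsed.scheme, identifier
```

Routing each scheme to its handler (task decomposition, result lookup, or model execution) then becomes a simple dispatch on the returned scheme.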
Scores are informational only and provided “as is” without warranty. AgentHotspot assumes no liability for actions taken based on these ratings.