by VyacheslavVanin • APIs & Integration
A FastAPI-based chat server that exposes endpoints for interacting with LLMs (Ollama or OpenAI), including session management and tool-approval workflows.
A hosted API layer to send user messages to Ollama or OpenAI models with configurable model, provider, and request parameters.
A built-in tool approval workflow to require human approval before executing external tool calls or actions.
Session management and optional streaming responses to maintain context across interactions and deliver incremental outputs.
This repository provides a configurable FastAPI server for chat interactions with LLM providers (Ollama or OpenAI). It exposes endpoints to start sessions, send user messages, handle tool invocation approvals, and retrieve session state, with configuration via environment variables or CLI arguments. The server supports streaming responses and is designed to integrate LLM capabilities into applications while controlling tool calls through an approval process.
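The session-management and tool-approval flow described above can be sketched in plain Python. This is an illustrative sketch only, not the repository's actual code: all class and method names here (`ChatSessions`, `request_tool`, `approve_tool`, and so on) are assumptions, and the FastAPI endpoint wiring and LLM provider calls are omitted to keep the example self-contained.

```python
# Illustrative sketch (not the repository's actual implementation):
# an in-memory session store with a human-in-the-loop tool-approval
# gate, mirroring the workflow the listing describes.
import uuid
from dataclasses import dataclass, field
from enum import Enum


class ToolCallStatus(Enum):
    PENDING = "pending"    # waiting for human approval
    APPROVED = "approved"  # may be executed
    DENIED = "denied"      # must not be executed


@dataclass
class ToolCall:
    name: str
    arguments: dict
    status: ToolCallStatus = ToolCallStatus.PENDING


@dataclass
class Session:
    messages: list = field(default_factory=list)       # chat history
    pending_tools: dict = field(default_factory=dict)  # call_id -> ToolCall


class ChatSessions:
    """In-memory session manager with tool-approval gating (hypothetical)."""

    def __init__(self):
        self._sessions: dict = {}

    def start(self) -> str:
        """Create a new session and return its id."""
        session_id = uuid.uuid4().hex
        self._sessions[session_id] = Session()
        return session_id

    def add_message(self, session_id: str, role: str, content: str) -> None:
        """Append a chat message to the session's history."""
        self._sessions[session_id].messages.append(
            {"role": role, "content": content}
        )

    def request_tool(self, session_id: str, name: str, arguments: dict) -> str:
        """Record a tool call proposed by the model; it stays PENDING
        until a human approves or denies it."""
        call_id = uuid.uuid4().hex
        self._sessions[session_id].pending_tools[call_id] = ToolCall(name, arguments)
        return call_id

    def approve_tool(self, session_id: str, call_id: str, approved: bool) -> ToolCall:
        """Apply the human decision to a pending tool call."""
        call = self._sessions[session_id].pending_tools[call_id]
        call.status = ToolCallStatus.APPROVED if approved else ToolCallStatus.DENIED
        return call


# Usage: start a session, record a user message, then gate a tool call.
sessions = ChatSessions()
sid = sessions.start()
sessions.add_message(sid, "user", "What's the weather in Paris?")
cid = sessions.request_tool(sid, "get_weather", {"city": "Paris"})
call = sessions.approve_tool(sid, cid, approved=True)
print(call.status.value)  # approved
```

In the real server, `request_tool` would correspond to the model emitting a tool call during a chat turn, and `approve_tool` to a dedicated approval endpoint hit by the client before the tool is actually executed.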