The Ask Ollama MCP Server exposes 3 tools, all in a single category (Read), categorised by risk level:

- ask-ollama: Send a prompt to a local Ollama LLM (defaults to qwen2.5-coder:7b with automatic fallback). Use for code review, second opinions, analysis, and AI-…
- get-usage-stats: Get the current MCP server's session usage stats: total LLM calls, token totals (input/output/thinking/cached), wall time, and breakdowns per provider…
- ping: Test connectivity with the Ollama MCP server and list locally available models.
To control these tools, use Intercept, the open-source MCP proxy: write YAML rules for each tool (rate limits, argument validation, or deny rules), then run Intercept in front of the Ask Ollama server.
All 3 Ask Ollama tools are categorised as Read, and each category has a recommended default policy; a sketch of per-tool rules is shown below.
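The following is a minimal sketch of what such a rules file might look like, assuming a simple schema with per-tool allow/deny decisions and a rate limit on the LLM-calling tool. The field names (server, tool, action, rate_limit) are illustrative assumptions rather than Intercept's documented schema, so consult the Intercept documentation for the actual rule format and for how the file is passed to the proxy.

```yaml
# Hypothetical Intercept policy for the Ask Ollama MCP server.
# Field names are illustrative assumptions, not Intercept's documented schema.
server: ask-ollama
rules:
  - tool: ask-ollama
    action: allow
    rate_limit: 30/hour    # cap how often agents can call the local LLM
  - tool: get-usage-stats
    action: allow          # read-only session stats, low risk
  - tool: ping
    action: allow          # connectivity check, low risk
  - tool: "*"
    action: deny           # deny any tool not explicitly listed
```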
Open source. One binary. Zero dependencies.
npx -y @policylayer/intercept