5 tools from the Ask Llm MCP Server, categorised by risk level.
- ask-llm: Send a prompt to an LLM provider (Gemini, Codex, Ollama). Specify which provider to use. Each provider auto-selects its best model with fallback on...
- get-usage-stats: Get the current MCP server's session usage stats: total LLM calls, token totals (input/output/thinking/cached), wall time, and breakdowns per provi...
- multi-llm: Dispatch the same prompt to multiple LLM providers in parallel and return all responses in one structured payload. Use when you want to compare ans...
- ping: Test connectivity with the MCP server.

The Ask Llm MCP server exposes 5 tools across 2 categories: Read and Execute.
Use Intercept, the open-source MCP proxy: write YAML rules for each tool (rate limits, argument validation, or deny rules), then run Intercept in front of the Ask Llm server.
Ask Llm tools are categorised as Read (4) and Execute (1). Each category has a recommended default policy.
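A policy along these lines might look like the sketch below. The tool names come from the listing above, but the rule schema (field names such as `tool`, `action`, `rate_limit`, `validate`) is an assumption for illustration, not Intercept's documented format:

```yaml
# Hypothetical Intercept policy for the Ask Llm server.
# Field names below are illustrative assumptions, not the
# documented Intercept rule schema.
rules:
  - tool: ping
    action: allow              # read-only connectivity check
  - tool: get-usage-stats
    action: allow
    rate_limit: 60/minute      # cap stat polling
  - tool: ask-llm
    action: allow
    rate_limit: 10/minute      # throttle prompt dispatch
    validate:
      provider: [gemini, codex, ollama]   # reject unknown providers
  - tool: multi-llm
    action: deny               # example deny rule: block parallel fan-out
```

Each rule pairs a tool with one of the three rule kinds mentioned above: a rate limit, an argument-validation constraint, or an outright deny.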
Open source. One binary. Zero dependencies.
npx -y @policylayer/intercept