10 tools from the Pincer MCP Server, categorised by risk level.
View the Pincer policy →

- `claude_chat`: Chat completions with Anthropic Claude models (Claude 3.5 Sonnet, Opus, Haiku).
- `openai_chat`: Chat completions with OpenAI GPT models (gpt-4o, gpt-4-turbo, gpt-3.5-turbo, etc.).
- `openai_compatible_chat`: Chat completions with **any** OpenAI-compatible API (Azure OpenAI, Ollama, vLLM, etc.).
- `openai_compatible_list_models`: List models from custom OpenAI-compatible endpoints.
- `openai_list_models`: List all available OpenAI models.
- `openrouter_chat`: Unified API access to 100+ models from multiple providers (OpenAI, Anthropic, Google, Meta, etc.).
- `openrouter_list_models`: List all available models across OpenRouter providers.
- `openwebui_chat`: OpenAI-compatible interface for self-hosted LLMs.
- `openwebui_list_models`: Discover available models on an OpenWebUI instance.

The Pincer MCP server exposes 10 tools across 2 categories: Read, Write.
Use Intercept, the open-source MCP proxy. Write a YAML rule for each tool (rate limits, argument validation, or deny rules), then run Intercept in front of the Pincer server.
Pincer tools are categorised as Read (9), Write (1). Each category has a recommended default policy.
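As a sketch, a per-tool policy covering the three rule types mentioned above might look like the following. The schema (`match`, `action`, `rate`, `args`) is illustrative only, not Intercept's documented format; consult the Intercept docs for the real rule syntax.

```yaml
# Hypothetical Intercept policy for the Pincer server.
# Key names below are assumptions, not Intercept's actual schema.
rules:
  - match: { tool: openrouter_chat }
    action: allow
    rate: 30/min             # rate-limit chat completions
  - match: { tool: openai_compatible_chat }
    action: validate
    args:
      base_url: "^https://"  # argument validation: require HTTPS endpoints
  - match: { tool: "*" }
    action: deny             # default-deny anything unmatched
```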
Open source. One binary. Zero dependencies.
```shell
npx -y @policylayer/intercept
```