Create a complete x402 paid AI API project with OpenRouter integration. This creates a NEW PROJECT FOLDER with everything needed to deploy a pay-per-use AI API: a full Hono.js application with x402 payment middleware, OpenRouter integration for Claude, GPT-4, Llama, etc., ready for deployment to Cloudflare Workers.
Risk factors:

- Accepts a file system path (`endpoints[].path`)
- High parameter count (14 properties)
- Bulk/mass operation: affects multiple targets
Part of the Aibtc MCP server. Enforce policies on this tool with Intercept, the open-source MCP proxy.
AI agents invoke scaffold_x402_ai_endpoint to trigger processes or run actions in Aibtc. Execute operations can have side effects beyond the immediate call: triggering builds, sending notifications, or starting workflows. Rate limits and argument validation are essential to prevent runaway execution.
scaffold_x402_ai_endpoint can trigger processes with real-world consequences. An uncontrolled agent might start dozens of builds, send mass notifications, or kick off expensive compute jobs. Intercept enforces rate limits and validates arguments to keep execution within safe bounds.
Execute tools trigger processes. Rate-limit and validate arguments to prevent unintended side effects.
```yaml
tools:
  scaffold_x402_ai_endpoint:
    rules:
      - action: allow
        rate_limit:
          max: 10      # at most 10 calls...
          window: 60   # ...per 60-second window
        validate:
          required_args: true
```

See the full Aibtc policy for all 288 tools.
Create a complete x402 paid AI API project with OpenRouter integration. This creates a NEW PROJECT FOLDER with everything needed to deploy a pay-per-use AI API:

- Full Hono.js application with x402 payment middleware
- OpenRouter integration for Claude, GPT-4, Llama, etc.
- Ready for deployment to Cloudflare Workers

## What Gets Created

A folder named `{projectName}` containing:

- src/index.ts - Hono app with your x402-protected AI endpoints
- src/x402-middleware.ts - Payment verification (uses native relay fetch)
- src/openrouter.ts - OpenRouter API client
- wrangler.jsonc - Cloudflare Worker config
- .dev.vars - Local dev variables (needs OPENROUTER_API_KEY)
- README.md - Documentation

## AI Types

- **chat**: General chat/Q&A
- **completion**: Text completion
- **summarize**: Summarize text
- **translate**: Translate text
- **custom**: Custom system prompt

## Quick Start After Generation

```
cd {projectName}
npm install
# Edit .dev.vars with RECIPIENT_ADDRESS and OPENROUTER_API_KEY
npm run dev
```

scaffold_x402_ai_endpoint is categorised as an Execute tool in the Aibtc MCP Server, which means it can trigger actions or run processes. Use rate limits and argument validation.
Add a rule for scaffold_x402_ai_endpoint under the `tools` section of your Intercept YAML policy. You can allow, deny, rate-limit, or validate arguments. Then run Intercept as a proxy in front of the Aibtc MCP server.
scaffold_x402_ai_endpoint is an Execute tool with high risk. Execute tools should be rate-limited and have argument validation enabled.
Yes. Add a rate_limit block to the scaffold_x402_ai_endpoint rule in your Intercept policy. For example, setting max: 10 and window: 60 limits the tool to 10 calls per minute. Rate limits are tracked per agent session and reset automatically.
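A minimal sketch of such a rule, assuming the same policy schema as the excerpt above (`tools` → tool name → `rules`):

```yaml
tools:
  scaffold_x402_ai_endpoint:
    rules:
      - action: allow
        rate_limit:
          max: 10      # at most 10 calls...
          window: 60   # ...per 60-second window
```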
Set action: deny in the Intercept policy for scaffold_x402_ai_endpoint. The AI agent will receive a policy violation error and cannot call the tool. You can also include a reason field to explain why the tool is blocked.
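A sketch of a deny rule under the same assumed schema; the `reason` text here is illustrative:

```yaml
tools:
  scaffold_x402_ai_endpoint:
    rules:
      - action: deny
        reason: "Project scaffolding is disabled in this environment"
```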
scaffold_x402_ai_endpoint is provided by the Aibtc MCP server (@aibtc/mcp-server). Intercept sits as a proxy in front of this server to enforce policies before tool calls reach the server.
Open source. One binary. Zero dependencies.
```
npx -y @policylayer/intercept
```