AI & MCP
Zerq natively supports the Model Context Protocol (MCP) — the emerging standard for connecting AI models to tools and APIs. This means AI assistants, coding agents, and autonomous systems can discover and call your APIs through the same gateway that your apps use, with the same credentials, rate limits, and audit trail.
What is MCP?
The Model Context Protocol is an open protocol that standardizes how AI models interact with external tools and data sources. MCP-compatible AI tools (Claude, GitHub Copilot, Cursor, etc.) can discover available tools and invoke them.
Zerq exposes two MCP endpoints:
| Endpoint | Who uses it | What it does |
|---|---|---|
| MCP Gateway | AI agents calling your APIs | Discover and execute your published API endpoints |
| MCP Management | AI operators managing the platform | Manage collections, proxies, policies, and workflows; includes MCP prompts support |
Why this matters
For AI-powered apps
Your mobile app, your partner integration, and your AI agent can all use the same gateway:
- Same credentials and authentication
- Same rate limits and quotas
- Same audit trail — all in one view
No separate "AI endpoint" or second set of keys.
For compliance teams
All AI-originated requests appear in your existing metrics, logs, and dashboards. Compliance and security teams have a single, unified view of all API usage — human and AI.
For platform engineers
AI agents that manage the platform (creating collections, updating workflows, etc.) use the same auth model as the admin UI. One configuration, one audit trail.
How MCP works with Zerq
- AI agent connects using MCP (JSON-RPC over HTTP/SSE transport).
- tools/list returns published, permitted gateway operations.
- tools/call executes the selected operation through Zerq routing.
- Gateway security checks apply: X-Client-ID, X-Profile-ID, auth, and policy limits.
- Observability remains unified: requests appear in standard logs and metrics.
- Backend handling is unchanged: calls ultimately reach the same upstream services as non-AI traffic.
The MCP Management endpoint also supports prompts/list and prompts/get for guided agent workflows.
Use cases
1. AI assistant that answers questions via live APIs
Build a chatbot that can answer "What's the status of order #1234?" by calling your order management API through Zerq. The AI discovers the available endpoints and executes them on behalf of the user.
2. Coding agent with API access
Give a coding agent (like Cursor or Claude) access to your internal APIs so it can look up documentation, fetch schemas, or trigger builds — all through the same controlled gateway.
3. Automated platform management
Use an AI agent with MCP Management access to automate platform tasks: create new API collections from OpenAPI specs, update rate limits, or build workflows from natural language descriptions.
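As a sketch, a management agent's request to create a collection might be built like this. The tool name (create_collection) and argument names are illustrative assumptions, not Zerq's actual management schema.

```python
def create_collection_request(name: str, openapi_spec: dict) -> dict:
    """Build a JSON-RPC 2.0 tools/call body for the MCP Management endpoint."""
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "create_collection",  # hypothetical management tool
            "arguments": {"name": name, "spec": openapi_spec},
        },
    }
```

Because this goes through the same auth model as the admin UI, the resulting change shows up in the same audit trail as a human operator's.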
4. Full visibility and control
Apply the same per-client rate limits to AI clients as to human applications. If an AI agent is abusing the API, the same throttling and quota enforcement kicks in.
Next steps
- MCP Gateway — connect AI agents to your APIs
- MCP Management — let AI manage the platform
- Cursor Setup — configure Cursor MCP connectivity
- Claude Setup — configure Claude MCP connectivity
- ChatGPT Setup — configure ChatGPT MCP connectivity