Dobby

Agentic Gateway

Central middleware routing all LLM and MCP requests through authenticated keys. One endpoint for every AI interaction — unified cost tracking, policy enforcement, and audit trail.

Quick Start

1. Create a Gateway Key

Go to your workspace Settings > Gateway and generate a key.

2. Make an LLM call with the OpenAI SDK

No custom dependencies — use the standard OpenAI SDK and point it at Dobby.

First, install the SDK: pip install openai
Python
from openai import OpenAI

client = OpenAI(
    api_key="<YOUR_GATEWAY_KEY>",
    base_url="https://dobby-ai.com/api/v1/gateway"
)

response = client.chat.completions.create(
    model="claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello from Dobby!"}]
)

print(response.choices[0].message.content)

Switch providers by changing the model parameter. Dobby supports Claude, GPT-4, Gemini, Mistral, and 9 more.

3. See your costs

Open the Gateway Usage tab in your dashboard. Every request is tracked with cost, latency, and token count.

API Key Tiers

Prefix      Type        Rate Limit   Use Case
gk_user_*   User        100 RPM      Individual developers
gk_svc_*    Service     500 RPM      Backend services and agents
gk_tmp_*    Temporary   50 RPM       Auto-expiring; testing and demos
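A key's tier can be inferred from its prefix. The sketch below mirrors the table above; the `key_tier` helper and its return shape are illustrative, not part of the gateway API:

```python
# Illustrative lookup: map a gateway key prefix to (tier, rate limit in RPM).
# Prefixes and limits mirror the tier table above; this is not gateway code.
KEY_TIERS = {
    "gk_user_": ("User", 100),      # individual developers
    "gk_svc_": ("Service", 500),    # backend services and agents
    "gk_tmp_": ("Temporary", 50),   # auto-expiring test/demo keys
}

def key_tier(api_key: str) -> tuple[str, int]:
    """Return (tier name, rate limit in RPM) for a gateway key."""
    for prefix, tier in KEY_TIERS.items():
        if api_key.startswith(prefix):
            return tier
    raise ValueError("unrecognized gateway key prefix")

print(key_tier("gk_svc_abc123"))  # → ('Service', 500)
```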

Capabilities

LLM Proxy

OpenAI-compatible endpoint for 13+ providers — Claude, GPT-4, Gemini, Mistral, and more. Switch models by changing one parameter.

MCP Server

73+ tools via JSON-RPC 2.0. Connect Claude Desktop, Cursor, VS Code, or any MCP-compatible client.

Cost Tracking

Automatic per-request cost attribution by agent, user, and provider. Budget enforcement with alerts at 80/90/100%.
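The 80/90/100% alert thresholds can be illustrated with a small sketch (the function name and shape are ours, not the gateway's internals):

```python
# Illustrative sketch of the 80/90/100% budget alert thresholds.
ALERT_THRESHOLDS = (0.80, 0.90, 1.00)

def crossed_alerts(spend: float, budget: float) -> list[int]:
    """Return the alert percentages the current spend has crossed."""
    ratio = spend / budget
    return [round(t * 100) for t in ALERT_THRESHOLDS if ratio >= t]

print(crossed_alerts(92.0, 100.0))  # → [80, 90]
```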

Policy Engine

Model restrictions, content filtering, DLP (9 PII patterns), rate limiting, and 4-layer policy hierarchy.
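The gateway's 9 PII patterns aren't enumerated here; the sketch below shows the general shape of regex-based DLP redaction using two illustrative patterns (email and US SSN):

```python
import re

# Two illustrative PII patterns; the gateway ships 9 (not enumerated here).
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each PII match with a [REDACTED:<TYPE>] placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[REDACTED:{label}]", text)
    return text

print(redact("Contact jane@example.com, SSN 123-45-6789."))
# → Contact [REDACTED:EMAIL], SSN [REDACTED:SSN].
```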

Agent Proxy

Route requests from external agents (CrewAI, LangSmith, custom REST). Auto-discovery registers agents on first call.

Kill-Switch

Emergency control to block all gateway traffic for an organization. Three scopes: all, LLM-only, new keys only.
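The three scopes can be sketched as a pure decision function (names and signature are assumptions, not the gateway's internals):

```python
from datetime import datetime, timezone

def is_blocked(scope: str, is_llm_request: bool, key_created_at: datetime,
               switch_activated_at: datetime) -> bool:
    """Decide whether a request is blocked under the active kill-switch scope.

    Scopes mirror the three listed above: 'all', 'llm_only', 'new_keys_only'.
    """
    if scope == "all":
        return True
    if scope == "llm_only":
        return is_llm_request
    if scope == "new_keys_only":
        # Block only keys created after the switch was flipped.
        return key_created_at > switch_activated_at
    raise ValueError(f"unknown kill-switch scope: {scope}")

old_key = datetime(2025, 1, 1, tzinfo=timezone.utc)
flipped = datetime(2025, 6, 1, tzinfo=timezone.utc)
print(is_blocked("new_keys_only", False, old_key, flipped))  # → False
```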

Key Endpoints

POST /api/v1/gateway/chat/completions — LLM proxy (OpenAI-compatible)
POST /api/v1/gateway/mcp/endpoint — MCP tools (JSON-RPC 2.0)
GET /api/v1/gateway/models — List available models
GET /api/v1/gateway/health — Public health check
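For reference, the routes above can be kept as a small client-side table (a sketch; the paths come from the list above, and the `gateway_url` helper is illustrative):

```python
BASE_URL = "https://dobby-ai.com/api/v1/gateway"

# Method and path for each key endpoint, as listed above.
ENDPOINTS = {
    "chat": ("POST", "/chat/completions"),  # LLM proxy (OpenAI-compatible)
    "mcp": ("POST", "/mcp/endpoint"),       # MCP tools (JSON-RPC 2.0)
    "models": ("GET", "/models"),           # list available models
    "health": ("GET", "/health"),           # public health check
}

def gateway_url(name: str) -> str:
    """Return the full URL for a named gateway endpoint."""
    _method, path = ENDPOINTS[name]
    return BASE_URL + path

print(gateway_url("health"))  # → https://dobby-ai.com/api/v1/gateway/health
```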
Full API Reference · Gateway Academy Guide