Cost Trends (last 7 days)
Provider Load & Routing
Top Spending Users
Create User
| Name | Status | Role | Weekly Cap | Organization | Actions |
|---|---|---|---|---|---|
Organizations
| Name | Slug | Owner | Members | API Spend Cap | Rollover | DLP | Created | Actions |
|---|---|---|---|---|---|---|---|---|
Create App
| Name | Slug | Contact | Active | DLP | Created | Actions |
|---|---|---|---|---|---|---|
Model Router
Configure model aliases to override which LLM is actually used when clients request a specific model via the OpenAI or Anthropic gateway endpoints. Org-specific aliases take priority over global ones.
| Requested Model | → | Routed To | Scope | Actions |
|---|---|---|---|---|
Agent Configurations
Agent model assignments are defined in model_config.yaml.
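A model_config.yaml entry presumably maps each agent to the three model tiers shown in the table below. The field names here are assumptions for illustration; only the file name and the Normal/Heavy/Max tiers come from this page.

```yaml
# Hypothetical sketch of one agent entry in model_config.yaml.
agents:
  - name: Support Bot
    agent_id: support-bot
    role: assistant
    normal_model: gpt-4o-mini      # everyday requests
    heavy_model: gpt-4o            # harder tasks
    max_model: claude-opus-4       # escalation ceiling
```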
| Name | Agent ID | Role | Normal Model | Heavy Model | Max Model |
|---|---|---|---|---|---|
Available Models
Gateway Logs
Recent API requests through the OpenAI/Anthropic gateway endpoints.
| Time | Key | User | Endpoint | Requested | Actual | Tokens | Cost |
|---|---|---|---|---|---|---|---|
Per-User Usage
| User | Requests | Input Tokens | Output Tokens | Cost | Actions |
|---|---|---|---|---|---|
DLP Protection
Sensitive values are replaced with placeholder tokens such as [EMAIL_1] before reaching the LLM provider; the model can still reason about them by reference, but the raw value never leaves your infrastructure.
Turn protection on per-organization (for user keys) or per-app (for app keys).
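A minimal sketch of the placeholder mechanism described above, assuming an email rule as the example: matches are swapped for indexed tokens on the outbound request and swapped back in the response. The regex and function names are illustrative, not the gateway's implementation.

```python
import re

# Assumed behavior: detected values become indexed tokens like [EMAIL_1];
# the token-to-value mapping stays on our side of the gateway.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def redact(text: str) -> tuple[str, dict[str, str]]:
    mapping: dict[str, str] = {}

    def repl(m: re.Match) -> str:
        token = f"[EMAIL_{len(mapping) + 1}]"
        mapping[token] = m.group(0)  # raw value never leaves our infrastructure
        return token

    return EMAIL_RE.sub(repl, text), mapping

def restore(text: str, mapping: dict[str, str]) -> str:
    # Re-insert raw values into the model's response before returning it.
    for token, raw in mapping.items():
        text = text.replace(token, raw)
    return text
```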
| Name | Status | Rules enabled | Last activity |
|---|---|---|---|
Recent activity
| When | Kind | Scope | Rule | Matches | Size | Request | Detail |
|---|---|---|---|---|---|---|---|
Gateway Settings
Configure gateway-wide features. Users can opt in to individual features from their dashboard.
Routing Strategy
Controls how the gateway orders fallback models on each request. Takes effect immediately — no restart required. Super-admin only.
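How a strategy might order the fallback candidates can be sketched as below. The strategy names and model fields are assumptions for illustration; the page does not list the gateway's actual strategies.

```python
import random

# Hypothetical ordering of a request's candidate models per strategy.
def order_candidates(models: list[dict], strategy: str) -> list[str]:
    if strategy == "cheapest-first":
        ranked = sorted(models, key=lambda m: m["cost_per_1k"])
    elif strategy == "fastest-first":
        ranked = sorted(models, key=lambda m: m["p50_latency_ms"])
    elif strategy == "shuffle":  # spread load across providers
        ranked = random.sample(models, k=len(models))
    else:
        ranked = list(models)  # keep the configured order as-is
    return [m["name"] for m in ranked]
```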
Ollama (Self-Hosted Models)
Connect a remote or local Ollama instance to route requests through self-hosted models at zero cost. Models are referenced as `ollama/model-name`.
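The `ollama/` naming convention amounts to a prefix check: strip the prefix and send the request to the configured Ollama instance instead of a cloud provider. The function and the default base URL below are illustrative assumptions (11434 is Ollama's standard port).

```python
# Assumed setting name; points at the connected Ollama instance.
OLLAMA_BASE_URL = "http://localhost:11434"

def pick_backend(model: str) -> tuple[str, str]:
    """Return (backend, bare model name) for a requested model string."""
    if model.startswith("ollama/"):
        return OLLAMA_BASE_URL, model.removeprefix("ollama/")
    return "cloud", model  # fall through to the normal provider path
```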
LLM Provider Credentials
Manage API keys and connection settings for each LLM provider without editing `.env`. Stored AES-256-GCM encrypted. Database values override environment variables at startup; changes hot-reload immediately. Super-admin only.
Model Fallback Chains
Configure which model steps in when a primary model fails (provider error, expired credits, or downtime).
| Primary Model | Fallback Chain | Action |
|---|---|---|
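Walking a fallback chain is: try the primary, then each fallback in order, until one call succeeds. A minimal sketch, where `call_model` stands in for the real provider call and is assumed to raise on provider error or downtime:

```python
def complete_with_fallback(primary: str, chains: dict[str, list[str]], call_model):
    """Return (model_used, response) from the first model in the chain to succeed."""
    last_err = None
    for model in [primary, *chains.get(primary, [])]:
        try:
            return model, call_model(model)
        except Exception as err:  # provider error / downtime: move down the chain
            last_err = err
    raise RuntimeError(f"all models in chain failed for {primary}") from last_err
```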
User Activity Export
Export all API activity for a specific user as a downloadable CSV. Includes input/output conversations, token counts, model used, cost, and latency. Useful for auditing and debugging user-side issues.
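The export format can be sketched with the standard `csv` module. The column names below mirror the fields listed above (conversations, tokens, model, cost, latency), but the exact schema is an assumption.

```python
import csv
import io

def export_user_activity(rows: list[dict]) -> str:
    """Serialize a user's activity records to CSV text for download."""
    fields = ["time", "input", "output", "input_tokens", "output_tokens",
              "model", "cost", "latency_ms"]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```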
Error Reports
| Time | Source | Type | Model | Error | User | Agent | Version |
|---|---|---|---|---|---|---|---|