AI tools for your
entire team.
One workspace URL for your engineering team. MCP servers with zero-trust secrets, per-user analytics, and instant access revocation. No API keys on developer machines. Full audit trail on every tool call.
Four pillars. Zero boilerplate.
AI Gateway
Multi-model routing to OpenAI, Anthropic, Gemini, Azure OpenAI, and Cloudflare AI from a single endpoint. BYOK, fallback chains, per-consumer token wallets, and metered billing built in.
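A fallback chain simply tries providers in order until one succeeds. A minimal sketch of the idea — the `Provider` type and `routeWithFallback` function below are illustrative, not Aerostack's actual routing internals:

```typescript
// Illustrative only: a generic provider fallback chain.
type Provider = (prompt: string) => Promise<string>;

async function routeWithFallback(
  providers: Provider[],
  prompt: string,
): Promise<string> {
  let lastError: unknown;
  for (const provider of providers) {
    try {
      return await provider(prompt); // first healthy provider wins
    } catch (err) {
      lastError = err; // remember the failure, fall through to the next provider
    }
  }
  throw new Error(`all providers failed: ${String(lastError)}`);
}
```

The same shape generalizes to per-provider timeouts and health checks; the gateway's job is to make this ordering a config concern rather than application code.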
MCP Workspace
Aggregate N MCP servers behind one authenticated URL. Auto-namespace tools, inject team secrets, and expose your private registry to Claude, ChatGPT, Cursor, or any MCP-compatible client.
Real-Time Infrastructure
WebSocket pub/sub, presence tracking, and database change streaming — all on Cloudflare's edge. Enterprise plans include 25K concurrent connections with automatic hibernation and scaling.
Developer Auth
Email OTP, magic link, BYO-JWT (connect your existing IdP), refresh token rotation, Turnstile CAPTCHA, and per-project end-user lifecycle management — all configurable without writing auth code.
Route any model.
Bill any way.
One endpoint. Any LLM. Your keys. Enterprise teams deploy AI APIs with multi-provider fallback chains, per-consumer rate limits, and metered billing — all without touching infrastructure.
Works with
```ts
import Aerostack from '@aerostack/node';

const client = new Aerostack({
  projectSlug: 'my-project',
  apiKey: process.env.AEROSTACK_API_KEY,
});

// Route to any model — OpenAI, Anthropic, Gemini, Azure
const stream = await client.gateway.chat({
  gatewaySlug: 'my-ai-api',
  consumerKey: 'ask_live_xxxxxxxxxxxx',
  messages: [{ role: 'user', content: 'Hello!' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.content ?? '');
}
```

One gateway. Every tool.
AI Clients
MCP Workspace
gateway.aerostack.dev/ws/{workspace-slug}
Single authenticated URL · Tool namespacing · BYOK secrets
Your Internal Tools
Your team's private tool registry — aggregate any number of MCP servers behind a single authenticated endpoint. Every AI model your team uses can reach the same unified toolset.
- Private by default — invisible to the public marketplace
- BYOK secret injection — your internal API keys never leave your config
- Automatic tool namespacing — prevent conflicts across servers
- Fan-out tools/list in parallel — fast cold start across N servers
- Works with Claude, ChatGPT, Cursor, and any MCP-compatible client
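Automatic namespacing can be pictured as prefixing each tool name with its server's slug, so two servers that both expose a tool named `search` never collide. A hedged sketch of the idea — the shapes and the `__` separator below are illustrative, not the platform's actual implementation:

```typescript
// Illustrative sketch of tool namespacing across aggregated MCP servers.
interface Tool {
  name: string;
  description: string;
}

// Prefix every tool with its server slug to keep names unique workspace-wide.
function namespaceTools(servers: Record<string, Tool[]>): Tool[] {
  return Object.entries(servers).flatMap(([slug, tools]) =>
    tools.map((tool) => ({ ...tool, name: `${slug}__${tool.name}` })),
  );
}
```

For example, `search` tools from hypothetical `github` and `linear` servers would surface as `github__search` and `linear__search` in the unified workspace.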
Security is not a feature. It's the foundation.
300+
Edge locations
<50ms
P95 latency
25K
Realtime connections
10M
API req / month
9
SDK languages
Know exactly what's happening.
17 Usage Metrics
API requests, database queries, cache ops, storage, AI tokens, realtime connections — all tracked automatically, per project.
Anomaly Detection
Automatic abuse scoring (0–100). Scores above 80 trigger a burst throttle; a score of 100 flags the consumer for review.
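The tiering above maps to actions roughly like this — an illustrative threshold function, assuming the top of the 0–100 scale is the review threshold (the scoring model itself is Aerostack-internal):

```typescript
// Illustrative mapping from abuse score to enforcement action.
type Action = 'allow' | 'throttle' | 'flag';

function actionForScore(score: number): Action {
  if (score >= 100) return 'flag';   // maxed-out score: manual review
  if (score > 80) return 'throttle'; // burst throttle kicks in
  return 'allow';
}
```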
Full Audit Trail
30-day retention on every user action — entity creation, updates, deletions — with actor, timestamp, and diff.
5-Tier Logging
Edge logs → platform errors → developer traces → analytics events → usage rollups. Five levels, all automatic.
Per-Request Latency
P50 / P95 / P99 tracked per route and gateway API. Error rates broken down by status code.
Webhook Execution Logs
Full request/response bodies logged for every hook call. Status, latency, retry history all visible in dashboard.
Same DX.
Any scale.
Enterprise teams get the same CLI, SDKs, and type-safe RPC client as the rest of the platform. No new tooling to learn, no lock-in. Just higher limits.
SDK Languages
CLI
```sh
npx aerostack create
npx aerostack deploy
npx aerostack add <slug>
```

Enterprise vs Business
| | Aerostack Enterprise | Business ($149/mo) |
|---|---|---|
| API requests / mo | Custom | 10M |
| Projects | Custom | 50 |
| Realtime connections | Custom | 25K |
| AI tokens / mo | Custom | 100M |
| MCP workspaces | Unlimited | 10 |
| Support | Dedicated CSM | Email only |
| SLA | 99.99% | Best effort |
| Onboarding | Dedicated setup | Self-serve |
| Billing | Custom invoice | Stripe portal |
How Enterprise works.
Talk to Us
Tell us your team size, AI stack, and compliance requirements.
Custom Setup
We configure your private MCP workspace, limits, and dedicated support.
Ship in Days
Same CLI, same SDKs — enterprise-grade from day one.
Custom pricing starting from Business tier ($149/mo). Volume discounts and annual invoicing available.
View standard plans
Ready to deploy your
AI infrastructure?
Talk to our team about custom limits, private MCP workspaces, and dedicated support.
hello@aerostack.dev