Aerostack
Enterprise

AI tools for your
entire team.

One workspace URL for your engineering team. MCP servers with zero-trust secrets, per-user analytics, and instant access revocation. No API keys on developer machines. Full audit trail on every tool call.

//Platform

Four pillars. Zero boilerplate.

AI Gateway

Multi-model routing to OpenAI, Anthropic, Gemini, Azure OpenAI, and Cloudflare AI from a single endpoint. BYOK, fallback chains, per-consumer token wallets, and metered billing built in.

MCP Workspace

Aggregate any number of MCP servers behind one authenticated URL. Auto-namespace tools, inject team secrets, and expose your private registry to Claude, ChatGPT, Cursor, or any MCP-compatible client.

Real-Time Infrastructure

WebSocket pub/sub, presence tracking, and database change streaming — all on Cloudflare's edge. Enterprise plans include 25K concurrent connections with automatic hibernation and scaling.

Developer Auth

Email OTP, magic link, BYO-JWT (connect your existing IdP), refresh token rotation, Turnstile CAPTCHA, and per-project end-user lifecycle management — all configurable without writing auth code.

AI Gateway

Route any model.
Bill any way.

One endpoint. Any LLM. Your keys. Enterprise teams deploy AI APIs with multi-provider fallback chains, per-consumer rate limits, and metered billing — all without touching infrastructure.

Multi-provider fallback · BYOK secrets · SSE streaming · Token wallets · Metered billing · IP filtering

Works with

OpenAI · Anthropic · Gemini · Azure OpenAI · Cloudflare AI
gateway-stream
import Aerostack from '@aerostack/node';

const client = new Aerostack({
  projectSlug: 'my-project',
  apiKey: process.env.AEROSTACK_API_KEY,
});

// Route to any model — OpenAI, Anthropic, Gemini, Azure
const stream = await client.gateway.chat({
  gatewaySlug: 'my-ai-api',
  consumerKey: 'ask_live_xxxxxxxxxxxx',
  messages: [{ role: 'user', content: 'Hello!' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.content ?? '');
}

//MCP Workspace

One gateway. Every tool.

AI Clients

Claude · ChatGPT · Cursor · Gemini

MCP Workspace

gateway.aerostack.dev/ws/{workspace-slug}

Single authenticated URL · Tool namespacing · BYOK secrets

Your Internal Tools

CRM API
Docs MCP
DB Search
Analytics
Alerts
Custom

Your team's private tool registry — aggregate any number of MCP servers behind a single authenticated endpoint. Every AI model your team uses can reach the same unified toolset.

  • Private by default — invisible to the public marketplace
  • BYOK secret injection — your internal API keys never leave your config
  • Automatic tool namespacing — prevent conflicts across servers
  • Fan-out tools/list in parallel — fast cold start across N servers
  • Works with Claude, ChatGPT, Cursor, and any MCP-compatible client
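
To connect a client, you point it at the workspace URL shown above. As an illustrative sketch only — configuration field names and the auth header vary by MCP client, and `my-workspace` plus the `<workspace-token>` placeholder are hypothetical — a remote server entry might look like:

```json
{
  "mcpServers": {
    "aerostack-workspace": {
      "url": "https://gateway.aerostack.dev/ws/my-workspace",
      "headers": {
        "Authorization": "Bearer <workspace-token>"
      }
    }
  }
}
```

Because the workspace aggregates every server behind this one entry, adding or removing internal tools never requires touching client configuration again.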
//Security

Security is not a feature. It's the foundation.

PBKDF2 600k iterations
AES-256 encryption at rest
TLS 1.3 in transit
BYO-JWT (connect your IdP)
HMAC-SHA256 signed webhooks
Mandatory refresh token rotation
Per-tenant state isolation
5-attempt lockout (60 min)
Data residency control
5-tier audit logging
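
Webhook consumers can verify the HMAC-SHA256 signature mentioned above before trusting a payload. A minimal sketch in Node, assuming a hex-encoded signature header — the exact header name and encoding Aerostack uses are not specified here:

```javascript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Recompute the HMAC-SHA256 signature of the raw request body.
// The signing-secret format shown is an assumption for illustration.
function signWebhook(rawBody, secret) {
  return createHmac('sha256', secret).update(rawBody).digest('hex');
}

// Constant-time comparison of the received signature against a locally
// recomputed one. Always verify against the raw body bytes, not a
// re-serialized JSON object.
function verifyWebhook(rawBody, signatureHeader, secret) {
  const expected = Buffer.from(signWebhook(rawBody, secret), 'hex');
  const received = Buffer.from(String(signatureHeader), 'hex');
  if (received.length !== expected.length) return false;
  return timingSafeEqual(received, expected);
}
```

The length check before `timingSafeEqual` matters: Node throws if the buffers differ in length, and an early boolean return keeps the comparison constant-time for well-formed signatures.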

300+

Edge locations

<50ms

P95 latency

25K

Realtime connections

10M

API req / month

9

SDK languages

//Observability

Know exactly what's happening.

17 Usage Metrics

API requests, database queries, cache ops, storage, AI tokens, realtime connections — all tracked automatically, per project.

Anomaly Detection

Automatic abuse scoring (0–100). Score above 80 triggers burst throttle. Score above 100 flags for review.
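
The cutoffs above amount to a small decision function. This is an illustrative sketch of the documented thresholds only, not Aerostack's scoring implementation:

```javascript
// Map an abuse score to the documented action:
// score > 80 → burst throttle, score > 100 → flag for review.
function anomalyAction(score) {
  if (score > 100) return 'flag_for_review';
  if (score > 80) return 'burst_throttle';
  return 'allow';
}
```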

Full Audit Trail

30-day retention on every user action — entity creation, updates, deletions — with actor, timestamp, and diff.

5-Tier Logging

Edge logs → platform errors → developer traces → analytics events → usage rollups. Five levels, all automatic.

Per-Request Latency

P50 / P95 / P99 tracked per route and gateway API. Error rates broken down by status code.

Webhook Execution Logs

Full request/response bodies logged for every hook call. Status, latency, retry history all visible in dashboard.

Developer Experience

Same DX.
Any scale.

Enterprise teams get the same CLI, SDKs, and type-safe RPC client as the rest of the platform. No new tooling to learn, no lock-in. Just higher limits.

SDK Languages

JS / Node · React · Python · Go · Flutter · Dart · PHP

CLI

npx aerostack create
npx aerostack deploy
npx aerostack add <slug>
sdk-example
import Aerostack from '@aerostack/node';

const client = new Aerostack({
  projectSlug: 'my-project',
  apiKey: process.env.AEROSTACK_API_KEY,
});

// Route to any model — OpenAI, Anthropic, Gemini, Azure
const stream = await client.gateway.chat({
  gatewaySlug: 'my-ai-api',
  consumerKey: 'ask_live_xxxxxxxxxxxx',
  messages: [{ role: 'user', content: 'Hello!' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.content ?? '');
}

//Comparison

Enterprise vs Business

                         Aerostack Enterprise    Business ($149/mo)
API requests / mo        Custom                  10M
Projects                 Custom                  50
Realtime connections     Custom                  25K
AI tokens / mo           Custom                  100M
MCP workspaces           Unlimited               10
Support                  Dedicated CSM           Email only
SLA                      99.99%                  Best effort
Onboarding               Dedicated setup         Self-serve
Billing                  Custom invoice          Stripe portal
//Process

How Enterprise works.

1

Talk to Us

Tell us your team size, AI stack, and compliance requirements.

2

Custom Setup

We configure your private MCP workspace, limits, and dedicated support.

3

Ship in Days

Same CLI, same SDKs — enterprise-grade from day one.

Custom pricing starting from Business tier ($149/mo). Volume discounts and annual invoicing available.

View standard plans

Ready to deploy your
AI infrastructure?

Talk to our team about custom limits, private MCP workspaces, and dedicated support.

hello@aerostack.dev