Aerostack
//MCP Infrastructure Platform

Your AI's toolbox.
One URL. Every tool.

Create a workspace. Add MCP servers and skills. Get one authenticated gateway URL. Namespacing, secrets, team access — automatic.

MCP servers + Skills + Functions · OpenAI adapter · Gemini adapter · 90% revenue to creators
mcp.json — one entry, all your MCPs
{
  "mcpServers": {
    "my-team": {
      "url": "https://api.aerostack.ai/gateway/ws/acme-dev",
      "headers": { "Authorization": "Bearer mwt_..." }
    }
  }
}

← replaces 20 separate MCP configs

47 tools available · 8 MCPs merged · auto-namespaced
//Architecture

The AI thinks it's one server.
It's actually all of them.

One workspace aggregates MCP servers + Skills + Edge Functions. The AI model sees one server with all tools auto-namespaced.

[Architecture diagram] Claude, ChatGPT, Gemini, or any AI model sees one server over one connection. The gateway (gateway.aerostack.dev/ws/acme) fans out in parallel to MCP servers (github-mcp, postgres-mcp, slack-mcp), Skills (code-review, summarize), and Functions (custom-auth), then returns a merged response. Auto-namespaced · AES-256 secrets.

MCP Servers

Community or self-hosted MCP servers. Auto-namespaced in your workspace with `{server}__{tool}` naming.
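The merge-and-namespace step can be sketched in a few lines of TypeScript (illustrative only; `namespaceTools` and the `Tool` shape are invented for this sketch, not Aerostack APIs):

```typescript
// Illustrative sketch: merge tool lists from several MCP servers and prefix
// each tool with its server name, so "search_repos" on the "github" server
// is exposed to the model as "github__search_repos".
interface Tool {
  name: string;
  description: string;
}

function namespaceTools(toolsByServer: Record<string, Tool[]>): Tool[] {
  return Object.entries(toolsByServer).flatMap(([server, tools]) =>
    tools.map((t) => ({ ...t, name: `${server}__${t.name}` }))
  );
}

const merged = namespaceTools({
  github: [{ name: "search_repos", description: "Search GitHub repos" }],
  postgres: [{ name: "query", description: "Run a SQL query" }],
});
```

Because every tool carries its server prefix, two servers can both ship a `query` tool without colliding in the merged workspace.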

Skills

Public, private, or team-scoped cross-model skills. Works with Claude, OpenAI, and Gemini out of the box.

Edge Functions

Function-backed skills with dynamic TypeScript logic. Runs at 300+ edge locations, sub-50ms globally.
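A function-backed skill might look like a Workers-style fetch handler. This is a hypothetical sketch: `customAuth` and its toy token check are invented for illustration, and Aerostack's actual function signature may differ:

```typescript
// Hypothetical sketch of a function-backed skill as a Workers-style fetch
// handler. The "Bearer " prefix check is a toy stand-in for real logic.
const customAuth = {
  async fetch(req: Request): Promise<Response> {
    const token = req.headers.get("Authorization") ?? "";
    const ok = token.startsWith("Bearer ");
    return new Response(JSON.stringify({ ok }), {
      status: ok ? 200 : 401,
      headers: { "Content-Type": "application/json" },
    });
  },
};
```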

//Cross-LLM Adapters

Native format for every AI.

One workspace URL. Three auto-generated formats. Zero adapter code.

MCP Native

SSE · stdio
// mcp.json
{
  "mcpServers": {
    "workspace": {
      "url": "https://gateway.aerostack.dev/ws/acme",
      "headers": {
        "Authorization": "Bearer mwt_..."
      }
    }
  }
}
Claude Desktop · Cursor · VS Code · Any MCP client

OpenAI Format

/openai-tools
GET /ws/acme/openai-tools
// Auto-generated
[
  {
    "type": "function",
    "function": {
      "name": "github__search_repos",
      "description": "Search GitHub repos",
      "parameters": { ... }
    }
  }
]
OpenAI SDK · LangChain · Custom agents
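The adapter's work can be pictured as a small shape transform from an MCP tool definition to OpenAI's function-calling format (a sketch of the idea, not the gateway's actual code; `toOpenAiTool` and the `McpTool` shape are hypothetical names):

```typescript
// Illustrative: the shape transform behind /openai-tools. An MCP tool
// (name, description, inputSchema) maps onto OpenAI's tool format.
interface McpTool {
  name: string;
  description: string;
  inputSchema: object;
}

function toOpenAiTool(tool: McpTool) {
  return {
    type: "function",
    function: {
      name: tool.name,
      description: tool.description,
      parameters: tool.inputSchema,
    },
  };
}

const openAiTool = toOpenAiTool({
  name: "github__search_repos",
  description: "Search GitHub repos",
  inputSchema: { type: "object", properties: { q: { type: "string" } } },
});
```

The Gemini endpoint applies the same idea with `functionDeclarations` as the envelope, which is why no adapter code is needed on the client side.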

Gemini Format

/gemini-tools
GET /ws/acme/gemini-tools
// Auto-generated
{
  "functionDeclarations": [
    {
      "name": "github__search_repos",
      "description": "Search GitHub repos",
      "parameters": { ... }
    }
  ]
}
Gemini API · Vertex AI · Google ADK
//Hosting

Host it. Or bring your own.

Stop running MCP servers on your laptop. Deploy to the edge: your colleague, your team, and any agent worldwide connect to the same hosted URL.

01

You build

  • MCP server using @modelcontextprotocol/sdk
  • Define your tools.
  • Your logic. Your data.
02

Aerostack deploys

  • Global edge network
  • 300+ locations worldwide
  • Zero cold start
  • Auto-scaled globally
03

Anyone connects

  • Cursor
  • Claude Desktop
  • VS Code Copilot
  • Any custom agent

Your hosted MCP URL — paste into any AI client

https://your-server.aerostack.workers.dev/mcp

Sub-50ms globally · Zero cold start · Auto-scaled

Then add it to your workspace

Your hosted server becomes part of your one-URL stack. Team members get access with one token refresh.

//Observability

See every tool call.
Debug in real time.

Every tool invocation is logged — method, latency, status, model, workspace. Stream live or query historical data. Rate limiting and quota enforcement built in.

  • Servers in workspace: 5
  • Rate limit: 120 req/min
  • Quota enforced: per plan

  • Per-tool call logs with latency and status
  • Live tail — see requests as they happen
  • Rate limiting per workspace, per consumer key
  • Quota enforcement with grace period
Live · Tool Calls

Time          Tool                  Latency  Status
14:32:01.012  github__search_repos  42ms     200
14:32:01.318  summarize__text       127ms    200
14:32:02.044  — workspace.list      8ms      200
14:32:02.891  postgres-mcp__query   63ms     200
14:32:03.102  custom-auth__verify   19ms     200
14:32:03.779  slack-mcp__send_msg   88ms     200
Streaming from acme-workspace
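Per-key rate limiting of the kind described above can be pictured as a fixed-window counter (an illustrative sketch, not Aerostack's implementation; the 120 req/min figure mirrors the example limit shown):

```typescript
// Illustrative fixed-window rate limiter keyed by consumer key.
// Each key gets `limit` requests per `windowMs` window; excess requests
// would be rejected upstream with a 429.
class RateLimiter {
  private counts = new Map<string, { windowStart: number; used: number }>();
  constructor(private limit: number, private windowMs: number) {}

  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.counts.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // First request in a fresh window for this key.
      this.counts.set(key, { windowStart: now, used: 1 });
      return true;
    }
    if (entry.used >= this.limit) return false;
    entry.used += 1;
    return true;
  }
}

const limiter = new RateLimiter(120, 60_000); // 120 req/min per consumer key
```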

Need private MCPs for your team?

Create a private workspace — share MCPs with your whole team via one URL. Full access control and observability included.

See Private MCP