Build, deploy, and orchestrate AI integrations — plan connections, auto-generate workflows, create private MCPs, and publish to marketplace. 17 tools for end-to-end automation.
Use with AI Assistants (MCP)
Connect Claude, Cursor, or any MCP-compatible client — then call tools directly
① Add This MCP Server
Paste into your AI client config — then all its tools are available instantly.
```json
{
  "mcpServers": {
    "aerostack": {
      "url": "https://mcp.aerostack.dev/s/aerostack/mcp-aerostack",
      "headers": {
        "Authorization": "Bearer YOUR_AEROSTACK_TOKEN"
      }
    }
  }
}
```

Replace `YOUR_AEROSTACK_TOKEN` with your API token from the dashboard.
② Call a Tool
Ask your AI assistant to call a specific tool, or send raw JSON-RPC:
Natural Language Prompt
“Use the guide tool to get help on aerostack. topics: start, credentials, telegram_setup, discord_setup, whatsapp_setup, slack_setup, llm_keys, workspace_tools, functions, integrations, node:<type>. node types: trigger, llm_call, logic, mcp_tool, send_message, action, loop, code_block, auth_gate, schedule_message, delegate_to_bot, send_proactive, error_handler, parallel. use guide("start") first if new to aerostack”
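Under the hood, MCP clients exchange JSON-RPC 2.0 messages, so you can also send a raw `tools/call` request. A sketch of a call to the `guide` tool — the argument name (`topic`) is an assumption here; list the real schema with the standard `tools/list` method:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "guide",
    "arguments": { "topic": "start" }
  }
}
```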
Using a Workspace?
Add this MCP to your Workspace — your team shares one token, secrets are stored securely, and every AI agent in the workspace can call it without per-user setup.
Turn any AI coding agent into a full-stack Aerostack developer. Plan integrations, build bots, generate workflows, create private MCPs, and publish to marketplace — all through conversation.
Aerostack is a developer platform for building AI-native backends on the edge. This MCP server gives your AI agent (Claude Code, Cursor, Windsurf, or any MCP-compatible tool) the ability to plan and execute end-to-end integrations — from analyzing what services are needed, through building missing connectors, to deploying production workflows.
Instead of clicking through a dashboard or writing boilerplate, just describe what you want: "Connect my Notion to Ocoya — post new pages as social media content every hour" — and your AI agent handles everything.
| Tool | Description |
|---|---|
| `guide` | Contextual help — platform setup, node schemas, credential requirements, runtime APIs. 18+ topics. |
| `list` | List resources: bots, workflows, endpoints, webhooks, functions, workspace tools, templates. |
| `get` | Get full details — config, workflow graph, credential status, deployment state. |
| `create` | Create resources. Bot credentials validated + encrypted. Functions auto-deployed. |
| `update` | Partial updates. Functions auto-redeployed. Workspace secrets encrypted (AES-256-GCM). |
| `delete` | Delete with reference checking — won't delete a workflow used by a bot. |
| `validate` | Pre-flight checks: graph validation, bot config, function code safety. |
| `test` | Execute with test input — per-node execution logs for workflows. |
| `deploy` | Go live: register webhooks, publish workflows, push functions to the edge. |
| `scaffold` | Generate a complete resource from a natural-language description. |
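For instance, the walkthrough below uses `create` to deploy a private function. As a hedged sketch, its `tools/call` arguments would follow this shape — the field names are taken from that walkthrough, not from a verified schema:

```json
{
  "name": "create",
  "arguments": {
    "type": "function",
    "config": {
      "code": "...",
      "visibility": "private"
    }
  }
}
```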
| Tool | Description |
|---|---|
| `plan` | The brain — analyze an integration request, search the marketplace, detect gaps, and return an execution plan with credentials + workflow shape. |
| `capabilities` | Get MCP tools grouped by read/write/search, auth requirements, config schema. |
| `build_spec` | Generate an MCP server template for AI editors — boilerplate, tool stubs, aerostack.toml, README. |
| `sdk_docs` | SDK reference: MCP templates, function templates, auth patterns, deploy/test guides. |
| `setup_integration` | One-call workspace setup: create workspace + add MCPs + store secrets + verify health. |
| `generate_workflow` | Auto-build a workflow from workspace tools: trigger → source → transform → destination. |
| `publish` | Publish a private MCP to the marketplace with quality-gate validation. |
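To make the planner's contract concrete, here is an illustrative-only sketch of a `plan` call and the kind of result the walkthrough below describes. Every field name is an assumption, not a documented schema. Request arguments:

```json
{ "request": "connect Notion to Ocoya" }
```

and a result shaped roughly like:

```json
{
  "found": ["notion"],
  "missing": ["ocoya"],
  "credentials_needed": ["NOTION_TOKEN"],
  "workflow_shape": "trigger → read → transform → write",
  "next_steps": ["build_spec", "create", "setup_integration"]
}
```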
```text
1. plan("connect Notion to Ocoya")
   → Finds Notion MCP ✓, Ocoya not found ✗
   → Returns: credentials needed, workflow shape, next steps

2. build_spec({ service: "ocoya", tools: ["create_post"] })
   → Returns MCP template for your AI editor to implement

3. create({ type: "function", config: { code: "...", visibility: "private" } })
   → Deploys private MCP to edge

4. setup_integration({ name: "Notion → Ocoya", servers: [...], secrets: {...} })
   → Creates workspace, adds MCPs, stores encrypted secrets

5. generate_workflow({ workspace_id: "...", intent: "Post Notion pages to Ocoya" })
   → Auto-builds trigger → read → transform → write pipeline

6. test + deploy → Live!

7. publish({ function_id: "..." })
   → Optional: share your Ocoya MCP on the marketplace
```
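As an illustrative sketch of step 4, a `setup_integration` payload might look like the following — the server identifiers and secret key names are assumptions; substitute your own:

```json
{
  "name": "setup_integration",
  "arguments": {
    "name": "Notion → Ocoya",
    "servers": ["notion", "ocoya"],
    "secrets": {
      "NOTION_TOKEN": "secret_...",
      "OCOYA_API_KEY": "..."
    }
  }
}
```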
| Variable | Required | Description |
|---|---|---|
| `AEROSTACK_API_KEY` | Yes | Account key (starts with `ak_`, never expires). Get it from aerostack.dev → Settings → CLI Keys. |
Claude Code (.claude/mcp-config.json):
```json
{
  "mcpServers": {
    "aerostack": {
      "url": "https://mcp.aerostack.dev/ws/YOUR_WORKSPACE_SLUG",
      "headers": {
        "Authorization": "Bearer YOUR_WORKSPACE_TOKEN"
      }
    }
  }
}
```
Plan an integration:

```text
You: "I want to sync GitHub issues to Linear whenever they're labeled as 'bug'"

Agent: → plan() finds both MCPs available
       → setup_integration() creates workspace with both MCPs
       → generate_workflow() builds: webhook trigger → github.get_issue → transform → linear.create_issue
       → Asks for GitHub token + Linear API key
       → Deploys workflow
```

Build a missing MCP:

```text
You: "Connect Notion to Ocoya for social media posting"

Agent: → plan() finds Notion ✓, Ocoya ✗
       → build_spec("ocoya") gives template
       → Writes Ocoya MCP implementation
       → Deploys as private MCP
       → Wires workflow
       → Suggests publishing to marketplace
```
This MCP is deployed privately from the Aerostack monorepo:
```sh
# Build + deploy to dev
AEROSTACK_API_KEY=ak_... npm run deploy:dev

# Build + deploy to production
AEROSTACK_API_KEY=ak_... npm run deploy:prod
```
Source code is NOT in the public aerostack-market repo.
Live Endpoint
https://mcp.aerostack.dev/s/aerostack/mcp-aerostack
Sub-50ms globally · Zero cold start
@aerostack
Pre-built functions for the most common MCP tool patterns. Clone, extend, and deploy.
The Aerostack MCP gives Claude 17 tools covering specs, docs, and integrations. Claude can read, create, update, and manage Aerostack data directly from a conversation.
Claude can read specs, docs, and integrations from your Aerostack account through the MCP. All 17 tools respect your Aerostack account permissions — Claude only accesses data you're authorized to see.
Yes. The Aerostack MCP uses the open Model Context Protocol standard, so it works in Claude, Cursor, Windsurf, and any other MCP-compatible AI tool. All 17 Aerostack tools are available everywhere you connect it — install once on Aerostack, use it anywhere.
The Aerostack MCP exposes Aerostack's AI capabilities to Claude — including generation, transcription, and embedding tools. Claude can chain these with other MCPs to build multi-step AI workflows.