Aerostack
Fullstack Edge Functions

Every function is a
complete backend.

Database, cache, queue, AI, vector search, and storage — native bindings in every function. No HTTP calls. No added network latency. Zero config.

Native bindings · Zero config · Global edge
//The Problem

Traditional serverless = function + 6 external services.

Traditional

Function runtime
Database (HTTP) · +latency
Cache (HTTP) · +latency
Queue (HTTP) · +latency
AI provider (HTTP) · +latency
Object storage (HTTP) · +latency
Vector DB (HTTP) · +latency
7 services. 6 HTTP round-trips. 6 sets of credentials.

Aerostack

env.DB · SQL database
env.CACHE · Key-value cache
env.QUEUE · Background jobs
env.AI · LLM + embeddings
env.VECTORIZE · Vector search
env.STORAGE · Object storage
1 function. All built in. Zero HTTP overhead. Zero config.
//Built-in Bindings

Six services. Zero setup.

env.DB

Database

SQL queries, batch operations, prepared statements, Postgres-compatible routing.

env.CACHE

Cache

Key-value storage with TTL, atomic counters, list operations. In-memory speed at the edge.
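The TTL and counter semantics described above can be modeled in a few lines. This is an illustrative in-memory sketch, not the Aerostack implementation: `TTLCache` and its methods are hypothetical names used to show the behavior of put-with-expiry, get-after-expiry, and an atomic-style counter.

```typescript
// Illustrative in-memory model of key-value-with-TTL semantics.
// Not the env.CACHE implementation — a sketch of the described behavior.
class TTLCache {
  private store = new Map<string, { value: string; expiresAt: number }>();

  // Store a value; expirationTtl is in seconds (mirroring the handler example).
  put(key: string, value: string, opts?: { expirationTtl?: number }): void {
    const ttlMs = (opts?.expirationTtl ?? Infinity) * 1000;
    this.store.set(key, { value, expiresAt: Date.now() + ttlMs });
  }

  // Return the value, or null if missing or expired.
  get(key: string): string | null {
    const entry = this.store.get(key);
    if (!entry) return null;
    if (Date.now() >= entry.expiresAt) {
      this.store.delete(key);
      return null;
    }
    return entry.value;
  }

  // Read-modify-write counter; atomic here only because this model
  // is single-threaded and in-process.
  increment(key: string, by = 1): number {
    const next = Number(this.get(key) ?? "0") + by;
    this.put(key, String(next));
    return next;
  }
}
```

The point of the sketch is the contract, not the storage: an expired key behaves exactly like a missing key, so callers never branch on "stale vs. absent".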

env.QUEUE

Queue

Background jobs with retries, status tracking, and dead-letter handling. Fire and forget.
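The retry and dead-letter flow above amounts to: attempt the job, retry on failure up to a limit, then park it for inspection. A minimal sketch of that lifecycle, with hypothetical names (`processWithRetries`, `Job`) that are not the Aerostack queue API:

```typescript
// Illustrative model of retry + dead-letter handling for a queued job.
// Hypothetical names; env.QUEUE runs this lifecycle on the consumer side.
type Job = { id: string; payload: unknown };

function processWithRetries(
  job: Job,
  handler: (job: Job) => void,
  maxRetries: number,
  deadLetter: Job[],
): boolean {
  // Initial attempt plus up to maxRetries retries.
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      handler(job);
      return true; // delivered
    } catch {
      // swallow the error and retry
    }
  }
  deadLetter.push(job); // retries exhausted: park for inspection
  return false;
}
```

"Fire and forget" in the text refers to the producer side: the function calls `env.QUEUE.send(...)` and returns immediately, while this consumer-side lifecycle runs in the background.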

env.AI

AI

Multi-provider LLM inference, embeddings, streaming responses. OpenAI, Anthropic, and more.

env.VECTORIZE

Vector Search

Semantic similarity search, RAG pipelines, automatic embedding generation.
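At its core, the similarity search described above ranks stored embedding vectors by closeness to a query embedding. A sketch of that operation using cosine similarity — illustrative only; `env.VECTORIZE` performs this (plus indexing at scale) server-side, and the function names here are hypothetical:

```typescript
// Cosine similarity: dot product of the vectors over the product of their norms.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored vectors against a query embedding and keep the top k.
function topK(
  query: number[],
  docs: { id: string; vector: number[] }[],
  k: number,
): { id: string; score: number }[] {
  return docs
    .map((d) => ({ id: d.id, score: cosineSimilarity(query, d.vector) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```

A RAG pipeline is this ranking step sandwiched between two calls: embed the query (e.g. via `env.AI`), then feed the top-k matches into the LLM prompt.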

env.STORAGE

Storage

Object storage with CDN delivery. Upload, serve, and manage files at the edge.

//How It Looks

One function. Full backend.

index.ts
export default {
  async fetch(request: Request, env: Env) {
    const userId = new URL(request.url).searchParams.get('user_id')

    // Query database — native binding, not HTTP
    const orders = await env.DB.prepare(
      'SELECT * FROM orders WHERE user_id = ?'
    ).bind(userId).all()

    // Cache the result with TTL
    await env.CACHE.put('recent-orders', JSON.stringify(orders),
      { expirationTtl: 3600 })

    // Queue a background notification
    await env.QUEUE.send({ type: 'order-summary', userId })

    // AI analysis
    const summary = await env.AI.run('...', { messages: [...] })

    return Response.json({ orders, summary })
  }
}

Four bindings. One function. No infrastructure to manage.
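The `env` parameter in the handler is typed by an `Env` interface. A plausible shape for the four bindings used above, with hypothetical method signatures sketched from the handler's call sites — not the official Aerostack type definitions, which ship with the runtime:

```typescript
// Hypothetical Env shape inferred from the handler's usage.
// Illustrative signatures only; real binding types come from the runtime.
interface Env {
  DB: {
    prepare(sql: string): {
      bind(...args: unknown[]): { all(): Promise<unknown> };
    };
  };
  CACHE: {
    put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
  };
  QUEUE: {
    send(message: unknown): Promise<void>;
  };
  AI: {
    run(model: string, input: unknown): Promise<unknown>;
  };
}
```

Because the bindings arrive pre-wired on `env`, there are no connection strings or SDK clients to construct; the interface is the whole integration surface.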

//Use Cases

Build anything. All at the edge.

Custom API Backend

Build REST or GraphQL APIs with database, auth, and caching built in. Ship a complete backend in a single file.

Bot Intelligence Layer

Power AI bots with persistent memory, RAG over your data, and real-time tool execution at the edge.

RAG Pipeline

Ingest documents, generate embeddings, store vectors, and serve semantic search — all from one function.

Real-time Data Pipeline

Process webhooks, transform data, queue background jobs, and write results to database — with zero infrastructure.

//Foundation Layer

Functions power everything.

Edge functions are the foundation layer. Every MCP server, skill, bot workflow, and agent endpoint runs on top of the same fullstack runtime.

Edge Function · DB + Cache + Queue + AI + Vector + Storage

Start building fullstack edge functions.

Database, cache, queue, AI, vector search, and storage — in every function, on every deploy, at every edge location.