Aerostack
🔧 Developer Tools

OpenAI Proxy

Drop-in proxy for OpenAI chat completions

OpenAI Compatible · Streaming Passthrough · CORS Headers · Edge Deployed · Zero Config

About

A drop-in proxy for OpenAI's chat completions API deployed on the edge. Full streaming passthrough, CORS headers, and zero-config deployment — point any OpenAI SDK client at your proxy URL and it just works.

Deploy it free for personal use, or attach a Gateway to add consumer key auth, rate limiting, and token-based billing so you can resell access.

API Endpoints

POST /v1/chat/completions
GET /health
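Because the proxy is OpenAI-compatible, calling it looks exactly like calling OpenAI with the base URL swapped. A minimal sketch, assuming a hypothetical deployment URL and placeholder key (`PROXY_BASE` and `sk-demo` are illustrative, not real values):

```typescript
// Hypothetical proxy base URL; substitute your own deployment.
const PROXY_BASE = "https://your-proxy.example.com";

interface ProxyCall {
  url: string;
  init: { method: string; headers: Record<string, string>; body?: string };
}

// Build an OpenAI-compatible chat completion call aimed at the proxy.
function chatCompletionCall(base: string, apiKey: string, payload: object): ProxyCall {
  return {
    url: `${base}/v1/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "Authorization": `Bearer ${apiKey}`, // consumer key if a Gateway is attached
      },
      body: JSON.stringify(payload),
    },
  };
}

// The health check is a plain GET with no body.
function healthCall(base: string): ProxyCall {
  return { url: `${base}/health`, init: { method: "GET", headers: {} } };
}

const call = chatCompletionCall(PROXY_BASE, "sk-demo", {
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello" }],
});
// fetch(call.url, call.init) would then stream the completion back.
console.log(call.init.method, new URL(call.url).pathname);
```

The same shape works from any OpenAI SDK by setting its base URL option to your proxy URL.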

How It Works

1

Request Reception

POST /v1/chat/completions — OpenAI-compatible request received at the edge.

2

Auth & Billing

If a Gateway is attached, the consumer key is validated and the token quota is checked.

3

Proxy Forward

Request forwarded to OpenAI with your API key — streaming fully passed through.

4

Response Relay

OpenAI response streamed back to the client with CORS headers.
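The four steps above can be sketched as an edge fetch handler. This is an illustrative outline, not the template's actual source; the upstream URL, key handling, and CORS policy are assumptions:

```typescript
// Upstream target for step 3 (forwarding with the operator's key).
const UPSTREAM = "https://api.openai.com";

function corsHeaders(): Record<string, string> {
  return {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "POST, GET, OPTIONS",
    "Access-Control-Allow-Headers": "Authorization, Content-Type",
  };
}

async function handle(request: Request, openaiKey: string): Promise<Response> {
  const { pathname } = new URL(request.url);

  if (request.method === "OPTIONS") {
    // CORS preflight: answer immediately with the allow headers.
    return new Response(null, { status: 204, headers: corsHeaders() });
  }
  if (pathname === "/health") {
    return new Response("ok", { headers: corsHeaders() });
  }
  if (pathname === "/v1/chat/completions" && request.method === "POST") {
    // Step 3: forward to OpenAI with the operator's key. Passing the
    // request body stream straight through preserves SSE streaming.
    const upstream = await fetch(`${UPSTREAM}/v1/chat/completions`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "Authorization": `Bearer ${openaiKey}`,
      },
      body: request.body,
    });
    // Step 4: relay status and body, adding CORS headers on the way out.
    return new Response(upstream.body, {
      status: upstream.status,
      headers: { ...Object.fromEntries(upstream.headers), ...corsHeaders() },
    });
  }
  return new Response("Not found", { status: 404, headers: corsHeaders() });
}
```

Relaying `upstream.body` directly, rather than buffering the response, is what makes streaming passthrough work for server-sent-event completions.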

Use Cases

💰

API Reselling

Wrap OpenAI access with your own auth and billing to offer AI as a service.

🌐

CORS Proxy

Call OpenAI from browser-side code without exposing your API key.

🚦

Rate Limit Buffer

Add your own rate limiting layer in front of OpenAI to manage burst traffic.

📊

Logging & Analytics

Log all OpenAI requests through your proxy for usage analytics and debugging.
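The rate-limit buffer use case above is commonly implemented as a token bucket in front of the proxy. A minimal sketch, assuming a per-client bucket (the class and parameters here are hypothetical, not part of the template):

```typescript
// Token bucket: holds up to `capacity` tokens, refilling at
// `refillPerSec` tokens per second; each request spends one token.
class TokenBucket {
  private tokens: number;
  private last: number;

  constructor(private capacity: number, private refillPerSec: number, now = 0) {
    this.tokens = capacity;
    this.last = now;
  }

  // Try to take one token at time `now` (in seconds); returns whether
  // the request is allowed.
  allow(now: number): boolean {
    const elapsed = now - this.last;
    this.tokens = Math.min(this.capacity, this.tokens + elapsed * this.refillPerSec);
    this.last = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// Capacity of 2 absorbs small bursts; steady rate is 1 request/second.
const bucket = new TokenBucket(2, 1);
console.log(bucket.allow(0), bucket.allow(0), bucket.allow(0)); // → true true false
```

Placing one bucket per consumer key in front of the proxy smooths burst traffic before it ever reaches OpenAI's own rate limits.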

Quick Launch

Opens the Aerostack dashboard to deploy this template.

What's Included

OpenAI Compatible
Streaming Passthrough
CORS Headers
Edge Deployed
Zero Config
2 API endpoints
Edge deployed

Pipeline

LLM — AI text generation

Billing Model

metered

Pay per token used. Free tier included.
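Metered billing like this reduces to simple arithmetic over token counts. A sketch with made-up numbers (the free-tier size and per-token rate below are illustrative placeholders, not Aerostack's actual pricing):

```typescript
// Hypothetical metered-billing parameters, for illustration only.
const FREE_TIER_TOKENS = 100_000;       // tokens included free each month
const PRICE_PER_1K_TOKENS = 0.002;      // USD per 1,000 billable tokens

// Charge = tokens beyond the free tier, priced per thousand.
function monthlyCharge(tokensUsed: number): number {
  const billable = Math.max(0, tokensUsed - FREE_TIER_TOKENS);
  return (billable / 1000) * PRICE_PER_1K_TOKENS;
}

console.log(monthlyCharge(50_000));  // within the free tier → 0
console.log(monthlyCharge(600_000)); // 500k billable tokens → $1
```

With a Gateway attached, per-consumer token counts from step 2 of the pipeline would feed a calculation of this shape.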

Tags

openai · proxy · streaming · cors