OpenAI Chat Proxy
DevOps

Aerostack CLI

$ npx aerostack add aerostack/openai-chat-proxy
Use with AI Assistants
Connect Claude, Cursor, or any MCP-compatible client, then call this function by slug
① Add MCP Server
Add this once — access all Aerostack functions from your AI tool.
claude_desktop_config.json
{
"mcpServers": {
"aerostack": {
"url": "https://mcp.aerostack.dev",
"type": "http"
}
}
}

② Call this function
Ask your AI to use the call_function tool with this slug:
call_function
call_function({
slug: "openai-chat-proxy",
args: {}
})

OpenAI Chat Proxy
Deploy this function to get a free public URL that proxies OpenAI's chat completions API.
Usage
curl https://{your-project}.aerostack.dev/custom/openai-chat-proxy \
-H "Content-Type: application/json" \
-d '{"model":"gpt-4","messages":[{"role":"user","content":"Hello!"}]}'
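The curl call above can also be issued from TypeScript. A minimal sketch, assuming the same endpoint shape; the `buildChatBody` and `askProxy` helpers are illustrative names, not part of Aerostack:

```typescript
// Minimal TypeScript client for the proxy endpoint.
// The URL shape mirrors the curl example; substitute your project subdomain.
type Role = "system" | "user" | "assistant";

interface ChatMessage {
  role: Role;
  content: string;
}

// Builds the same JSON body the curl example sends.
function buildChatBody(model: string, messages: ChatMessage[]): string {
  return JSON.stringify({ model, messages });
}

async function askProxy(
  baseUrl: string, // e.g. "https://{your-project}.aerostack.dev"
  model: string,
  messages: ChatMessage[]
): Promise<unknown> {
  const res = await fetch(`${baseUrl}/custom/openai-chat-proxy`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildChatBody(model, messages),
  });
  if (!res.ok) throw new Error(`proxy returned ${res.status}`);
  return res.json();
}
```

Assuming the proxy forwards responses unchanged, the return value follows OpenAI's standard chat-completion shape.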
Setup
- Open the function editor and replace sk-your-key-here with your OpenAI API key
- Click Deploy to Edge
- Use the URL above — it's live immediately, no auth required
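For orientation, here is a hypothetical sketch of what the deployed function's core might do: read the incoming request body and forward it to OpenAI with your key attached. The `buildUpstreamRequest` helper and its shape are invented for illustration; the actual editor template may differ.

```typescript
// Hypothetical proxy core; the real editor template may differ.
const OPENAI_API_KEY = "sk-your-key-here"; // replace in the function editor

interface UpstreamRequest {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string;
}

// Attach the API key and target OpenAI's chat completions endpoint.
function buildUpstreamRequest(body: string): UpstreamRequest {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${OPENAI_API_KEY}`,
    },
    body,
  };
}

// Edge handler: forward the client's JSON body verbatim.
async function handler(req: Request): Promise<Response> {
  const { url, ...init } = buildUpstreamRequest(await req.text());
  return fetch(url, init);
}
```

Separating the request construction from the handler keeps the key-handling logic testable without a network call.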
Connect to a Gateway
For auth, billing, rate limiting, and analytics, connect this function to an AI Gateway.
Build and publish your own functions
Write a TypeScript function, deploy it to the edge, and share it with thousands of developers — in minutes.
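As a starting point, a first custom function might look like the sketch below; the handler signature is an assumption (check the Aerostack docs for the exact export shape), and `greet` is a name made up for this example:

```typescript
// Pure core logic, kept separate from the handler so it's easy to test.
function greet(name?: string): { greeting: string } {
  return { greeting: `Hello, ${name ?? "world"}!` };
}

// Hypothetical edge handler shape; Aerostack's actual signature may differ.
async function handler(req: Request): Promise<Response> {
  const payload = await req.json().catch(() => ({}));
  return new Response(JSON.stringify(greet(payload.name)), {
    headers: { "Content-Type": "application/json" },
  });
}
```

Deploying would then follow the same Deploy to Edge flow described above.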