Aerostack

OpenAI Chat Proxy

DevOps
Updated Mar 6, 2025
GitHub
terminal — aerostack cli
$ npx aerostack add aerostack/openai-chat-proxy

Use with AI Assistants (MCP)

Connect Claude, Cursor, or any MCP-compatible client, then call this function by its slug.

① Add MCP Server

Add this once — access all Aerostack functions from your AI tool.

claude_desktop_config.json
{
  "mcpServers": {
    "aerostack": {
      "url": "https://mcp.aerostack.dev",
      "type": "http"
    }
  }
}

② Call this function

Ask your AI to use the call_function tool with this slug:

call_function
call_function({
  slug: "openai-chat-proxy",
  args: {}
})
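The empty args object above is the minimal form. Assuming the function accepts the same payload shown in the curl example under Usage (an assumption; the args schema is not documented on this page), a fuller call might look like:

```
call_function({
  slug: "openai-chat-proxy",
  args: {
    model: "gpt-4",
    messages: [{ role: "user", content: "Hello!" }]
  }
})
```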

OpenAI Chat Proxy

Deploy this function to get a free public URL that proxies OpenAI's chat completions API.

Usage

curl https://{your-project}.aerostack.dev/custom/openai-chat-proxy \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4","messages":[{"role":"user","content":"Hello!"}]}'
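The same call can be made from TypeScript. A minimal sketch, assuming a placeholder project name in the URL and the standard chat-completions response shape:

```typescript
// Build the JSON body the proxy expects (same shape as the curl example).
function buildChatBody(prompt: string, model = "gpt-4"): string {
  return JSON.stringify({
    model,
    messages: [{ role: "user", content: prompt }],
  });
}

// Call the deployed proxy; "your-project" is a placeholder.
async function chat(prompt: string): Promise<string> {
  const res = await fetch(
    "https://your-project.aerostack.dev/custom/openai-chat-proxy",
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: buildChatBody(prompt),
    },
  );
  const data: any = await res.json();
  // Standard chat-completions response: first choice's message content.
  return data.choices[0].message.content;
}
```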

Setup

  1. Open the function editor and replace sk-your-key-here with your OpenAI API key
  2. Click Deploy to Edge
  3. Use the URL above — it's live immediately, no auth required
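For orientation, the key you replace in step 1 is attached inside a handler roughly like the one below. This is an illustrative sketch of a chat-completions proxy worker, not the template's actual source; the names are assumptions:

```typescript
// Illustrative sketch of a chat-completions proxy worker -- not the
// template's actual source. Replace the placeholder key before deploying.
const OPENAI_API_KEY = "sk-your-key-here";

// Headers sent upstream: JSON content type plus the bearer key.
function authHeaders(key: string): Record<string, string> {
  return {
    "Content-Type": "application/json",
    Authorization: `Bearer ${key}`,
  };
}

// Cloudflare Worker entry point: forward the request body to OpenAI.
export default {
  async fetch(request: Request): Promise<Response> {
    return fetch("https://api.openai.com/v1/chat/completions", {
      method: "POST",
      headers: authHeaders(OPENAI_API_KEY),
      body: await request.text(),
    });
  },
};
```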

Connect to a Gateway

For auth, billing, rate limiting, and analytics, connect this function to an AI Gateway.

Metadata

Version: 1.0.0
License: MIT
Language: typescript
Provider: cloudflare-worker


Build and publish your own functions

Write a TypeScript function, deploy it to the edge, and share it with thousands of developers — in minutes.
