Run any AI model on Replicate — image generation, video, audio, language models — and manage predictions, deployments, and your model library.
Use with AI Assistants (MCP)
Connect Claude, Cursor, or any MCP-compatible client — then call tools directly
① Add This MCP Server
Paste into your AI client config — then all its tools are available instantly.
{
  "mcpServers": {
    "replicate": {
      "url": "https://mcp.aerostack.dev/s/aerostack/mcp-replicate",
      "headers": {
        "Authorization": "Bearer YOUR_AEROSTACK_TOKEN"
      }
    }
  }
}

Replace YOUR_AEROSTACK_TOKEN with your API token from the dashboard.
② Call a Tool
Ask your AI assistant to call a specific tool, or send raw JSON-RPC:
Natural Language Prompt
“Use the _ping tool to verify Replicate credentials by calling a lightweight read endpoint.” (This tool is used internally by Aerostack to validate credentials.)
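If your client sends raw JSON-RPC instead of natural language, the `_ping` call can be sketched in Python. This is a minimal sketch assuming the endpoint and bearer-token header from step ①; the `build_tool_call` and `call_tool` helper names are illustrative, not part of any SDK.

```python
import json
import urllib.request

ENDPOINT = "https://mcp.aerostack.dev/s/aerostack/mcp-replicate"

def build_tool_call(name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 tools/call request body (the MCP wire shape)."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

def call_tool(token, name, arguments):
    """POST the request to the MCP endpoint with a bearer token (network call)."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(build_tool_call(name, arguments)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# The credential check from the prompt above, as a raw payload:
payload = build_tool_call("_ping", {})
```

The same payload shape works for any of the 12 tools; only `name` and `arguments` change.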
Using a Workspace?
Add this MCP to your Workspace — your team shares one token, secrets are stored securely, and every AI agent in the workspace can call it without per-user setup.
Live endpoint: https://mcp.aerostack.dev/s/aerostack/mcp-replicate
This MCP server gives AI agents access to Replicate via 12 tools. Connect it to any Aerostack workspace and your agents can interact with Replicate directly.
| Tool | Description |
|---|---|
| run_model | Run a Replicate model with a specific version and inputs. Returns prediction output or a prediction ID for async polling |
| get_prediction | Get the current status and output of a prediction by its ID |
| cancel_prediction | Cancel a prediction that is currently queued or in progress |
| list_predictions | List your recent predictions with status, model, and output URLs |
| get_model | Get details about a Replicate model: description, visibility, run count, and latest version |
| list_model_versions | List all available versions of a Replicate model with their creation dates and OpenAPI schemas |
| get_model_version | Get the OpenAPI input/output schema for a specific model version |
| search_models | Search Replicate public models by keyword, returning name, description, run count, and latest version |
| list_deployments | List your Replicate deployments (dedicated hosted model instances) |
| create_deployment_prediction | Run a prediction on a specific named deployment (useful for consistent latency with dedicated compute) |
| get_account | Get your Replicate account information: username, name, and account type |
| create_model | Create a new model on Replicate with a specified owner, name, visibility, and hardware |
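When run_model returns a prediction ID rather than immediate output, an agent pairs it with get_prediction in a polling loop. The sketch below shows that pattern under stated assumptions: `get_status` is a stand-in for however your client invokes the get_prediction tool, and the terminal status names mirror Replicate's prediction lifecycle ("succeeded", "failed", "canceled").

```python
import time

TERMINAL = {"succeeded", "failed", "canceled"}  # Replicate's terminal states

def poll_prediction(get_status, prediction_id, interval=2.0, timeout=300.0):
    """Poll until the prediction reaches a terminal state or the timeout expires.

    `get_status` stands in for your client's call to the get_prediction
    tool; it should return a dict with at least a "status" key (and
    "output" once the run succeeds).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        prediction = get_status(prediction_id)
        if prediction["status"] in TERMINAL:
            return prediction
        time.sleep(interval)
    raise TimeoutError(f"prediction {prediction_id} still running after {timeout}s")
```

A long-running video or audio model is the typical case for this loop; cancel_prediction covers the case where the caller gives up before the deadline.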
| Variable | Required | Description |
|---|---|---|
| REPLICATE_API_TOKEN | Yes | Your Replicate API token — found at replicate.com/account/api-tokens |
Add the following secrets under Project → Secrets:
REPLICATE_API_TOKEN

Once added, every AI agent in your workspace can use Replicate tools automatically.
curl -X POST https://mcp.aerostack.dev/s/aerostack/mcp-replicate \
-H 'Content-Type: application/json' \
-H 'X-Mcp-Secret-REPLICATE-API-TOKEN: your-replicate-api-token' \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"run_model","arguments":{}}}'
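The curl above sends empty `arguments`; in practice run_model needs at least a model version and its inputs. The sketch below shows what a populated body might look like. The `version` and `input` field names follow Replicate's predictions API, but the values here are placeholders, and you should fetch the authoritative schema with the get_model_version tool before relying on any field names.

```python
import json

# Hypothetical run_model arguments: "version" is a model version ID
# (see list_model_versions) and "input" holds the model's own inputs.
# Both values below are placeholders, not real identifiers.
body = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "run_model",
        "arguments": {
            "version": "MODEL_VERSION_ID",
            "input": {"prompt": "a photo of a red fox"},
        },
    },
}
print(json.dumps(body, indent=2))
```

The printed JSON is what you would pass to curl's `-d` flag in place of the empty-arguments body above.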
License: MIT
Yes. The Replicate MCP includes a create_deployment_prediction tool that lets Claude create deployment predictions in your Replicate account from a plain-English prompt. You can also check their status with get_prediction and cancel in-flight runs with cancel_prediction — no Replicate UI needed.
Yes. The Replicate MCP has search and list tools for models. Ask Claude to find models by any criteria and it runs the Replicate API query for you, returning results directly in the conversation.
Yes. The Replicate MCP uses the open Model Context Protocol standard, so it works in Claude, Cursor, Windsurf, and any other MCP-compatible AI tool. All 12 Replicate tools are available everywhere you connect it — install once on Aerostack.
Yes. The Replicate MCP includes the run_model and create_deployment_prediction tools, which let Claude execute Replicate models on demand. This is useful for automating repetitive Replicate tasks directly from a Claude or Cursor session.