Together AI
MCP Server · Hosted · Public
Run open-source LLMs, generate embeddings, create images, and fine-tune models using Together AI's inference platform.
Use with AI Assistants (MCP)
Connect Claude, Cursor, or any MCP-compatible client, then call tools directly.
① Add This MCP Server
Paste the following into your AI client config; all of the server's tools then become available.
{
  "mcpServers": {
    "together-ai": {
      "url": "https://mcp.aerostack.dev/s/aerostack/mcp-together-ai",
      "headers": {
        "Authorization": "Bearer YOUR_AEROSTACK_TOKEN"
      }
    }
  }
}

Replace YOUR_AEROSTACK_TOKEN with your API token from the dashboard.
② Call a Tool
Ask your AI assistant to call a specific tool, or send raw JSON-RPC:
Natural Language Prompt
“Use the chat tool to send messages to any Together AI chat model (Llama, Mistral, Qwen, etc.) and receive a response via its OpenAI-compatible interface.”
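Equivalently, you can POST a raw JSON-RPC request to the endpoint above. The sketch below follows the standard MCP `tools/call` shape; the tool name `chat` is taken from the prompt above, but the exact argument schema (`model`, `messages`) is an assumption — list the server's tools with `tools/list` to confirm.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "chat",
    "arguments": {
      "model": "meta-llama/Llama-3.3-70B-Instruct-Turbo",
      "messages": [
        { "role": "user", "content": "Hello!" }
      ]
    }
  }
}
```

Send it with `Content-Type: application/json` and the same `Authorization: Bearer` header configured in step ①.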
Using a Workspace?
Add this MCP to your Workspace — your team shares one token, secrets are stored securely, and every AI agent in the workspace can call it without per-user setup.
Details
Live Endpoint
https://mcp.aerostack.dev/s/aerostack/mcp-together-ai
Sub-50ms globally · Zero cold start
Publisher
@aerostack