List buckets, upload, download, delete, and manage objects in Amazon S3 — AI-native cloud storage access for any agent.
Use with AI Assistants (MCP)
Connect Claude, Cursor, or any MCP-compatible client — then call tools directly
① Add This MCP Server
Paste into your AI client config — then all its tools are available instantly.
```json
{
  "mcpServers": {
    "aws-s3": {
      "url": "https://mcp.aerostack.dev/s/aerostack/mcp-aws-s3",
      "headers": {
        "Authorization": "Bearer YOUR_AEROSTACK_TOKEN"
      }
    }
  }
}
```

Replace YOUR_AEROSTACK_TOKEN with your API token from the dashboard.
② Call a Tool
Ask your AI assistant to call a specific tool, or send raw JSON-RPC:
Natural Language Prompt
“Use the _ping tool to verify AWS S3 connectivity by listing buckets. (_ping is used internally by Aerostack to validate credentials.)”
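You can also skip the assistant and send the JSON-RPC request yourself. A minimal Python sketch, assuming the endpoint and Authorization header from the config above (the token is a placeholder):

```python
import json
import urllib.request

ENDPOINT = "https://mcp.aerostack.dev/s/aerostack/mcp-aws-s3"

def tools_call(tool: str, arguments: dict, req_id: int = 1) -> bytes:
    # Build a JSON-RPC 2.0 "tools/call" request body for the MCP endpoint
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }).encode()

def call(tool: str, arguments: dict, token: str) -> dict:
    # POST the request; requires a valid Aerostack API token
    req = urllib.request.Request(
        ENDPOINT,
        data=tools_call(tool, arguments),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# e.g. call("_ping", {}, "YOUR_AEROSTACK_TOKEN") to verify connectivity
```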
Using a Workspace?
Add this MCP to your Workspace — your team shares one token, secrets are stored securely, and every AI agent in the workspace can call it without per-user setup.
Live endpoint: https://mcp.aerostack.dev/s/aerostack/mcp-aws-s3
This MCP server gives AI agents access to AWS S3 via 9 tools. Connect it to any Aerostack workspace and your agents can interact with AWS S3 directly.
| Tool | Description |
|---|---|
| `list_buckets` | List all S3 buckets in the AWS account with name, region, and creation date |
| `list_objects` | List objects in an S3 bucket with optional prefix filter, delimiter for folder-like navigation, and pagination |
| `get_object` | Download an object from S3 and return its content as text. For binary files, returns a pre-signed download URL instead. |
| `put_object` | Upload a text or JSON object to S3 with optional content type and metadata |
| `delete_object` | Delete an object from an S3 bucket by key |
| `copy_object` | Copy an object from one location to another within or across S3 buckets |
| `head_object` | Get metadata for an S3 object without downloading it: size, content type, last modified, ETag, and custom metadata |
| `presign_url` | Generate a pre-signed URL for temporary access to an S3 object (upload or download) without sharing credentials |
| `create_bucket` | Create a new S3 bucket in the configured AWS region |
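The tools above all take a JSON-RPC `tools/call` envelope. A sketch of two typical request bodies in Python; the argument names (`bucket`, `key`, `prefix`, `delimiter`, `expiry`) are assumptions inferred from the tool descriptions, not a confirmed schema:

```python
import json

def tools_call(tool, arguments, req_id=1):
    # JSON-RPC 2.0 envelope shared by every tool invocation
    return {"jsonrpc": "2.0", "id": req_id,
            "method": "tools/call",
            "params": {"name": tool, "arguments": arguments}}

# Folder-style listing: everything under "logs/", one level deep
list_req = tools_call("list_objects",
                      {"bucket": "my-bucket", "prefix": "logs/", "delimiter": "/"})

# Temporary download link, valid for one hour
presign_req = tools_call("presign_url",
                         {"bucket": "my-bucket", "key": "logs/app.log", "expiry": 3600},
                         req_id=2)

print(json.dumps(list_req))
```

Each body is POSTed to the live endpoint exactly like the curl example further down.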
| Variable | Required | Description |
|---|---|---|
| `AWS_ACCESS_KEY_ID` | Yes | Access key ID for an IAM user or role with S3 permissions |
| `AWS_SECRET_ACCESS_KEY` | Yes | Secret access key paired with the access key ID |
| `AWS_REGION` | Yes | Default AWS region for S3 operations (e.g. `us-east-1`) |
Add the following secrets under Project → Secrets:
- `AWS_ACCESS_KEY_ID`
- `AWS_SECRET_ACCESS_KEY`
- `AWS_REGION`

Once added, every AI agent in your workspace can use AWS S3 tools automatically.
```shell
curl -X POST https://mcp.aerostack.dev/s/aerostack/mcp-aws-s3 \
  -H 'Content-Type: application/json' \
  -H 'X-Mcp-Secret-AWS-ACCESS-KEY-ID: your-aws-access-key-id' \
  -H 'X-Mcp-Secret-AWS-SECRET-ACCESS-KEY: your-aws-secret-access-key' \
  -H 'X-Mcp-Secret-AWS-REGION: your-aws-region' \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"list_buckets","arguments":{}}}'
```
License: MIT
Live Endpoint
https://mcp.aerostack.dev/s/aerostack/mcp-aws-s3
Sub-50ms globally · Zero cold start
**Can an agent generate a pre-signed URL without exposing credentials?**
Yes. `presign_url` accepts bucket, key, and expiry seconds. Claude returns the signed URL directly; no credentials are exposed to the client. Default expiry is typically 3600 seconds.

**How can an agent inspect an object before downloading it?**
Call `head_object` first; it returns content-length, ETag, and last-modified without fetching the body. Claude uses this to confirm the object exists and check its size before issuing `get_object`.

**Can objects be copied between buckets?**
Yes. `copy_object` takes a source bucket+key and a destination bucket+key. Both must be accessible with the configured IAM credentials. Claude can use this to archive or promote files across environments.

**What IAM permissions does the server need?**
`s3:ListAllMyBuckets`, `s3:ListBucket`, `s3:GetObject`, `s3:PutObject`, `s3:DeleteObject`, and `s3:CreateBucket`, scoped to the target buckets where possible. `copy_object` needs `s3:GetObject` on the source and `s3:PutObject` on the destination. A URL from `presign_url` is checked against `s3:GetObject` (or `s3:PutObject` for upload URLs) when it is used.
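A minimal IAM policy covering those permissions might look like the following sketch; `my-bucket` is a placeholder, and you should scope each `Resource` to your own buckets:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListBuckets",
      "Effect": "Allow",
      "Action": ["s3:ListAllMyBuckets"],
      "Resource": "*"
    },
    {
      "Sid": "BucketAccess",
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:CreateBucket"],
      "Resource": "arn:aws:s3:::my-bucket"
    },
    {
      "Sid": "ObjectAccess",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
```

Note that bucket-level actions (`s3:ListBucket`, `s3:CreateBucket`) apply to the bucket ARN, while object-level actions apply to `arn:aws:s3:::my-bucket/*`.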