Search models, datasets, and Spaces on Hugging Face — browse trending AI models, check downloads, and run inference from any agent.
Use with AI Assistants (MCP)
Connect Claude, Cursor, or any MCP-compatible client — then call tools directly
① Add This MCP Server
Paste into your AI client config — then all its tools are available instantly.
```json
{
  "mcpServers": {
    "huggingface": {
      "url": "https://mcp.aerostack.dev/s/aerostack/mcp-huggingface",
      "headers": {
        "Authorization": "Bearer YOUR_AEROSTACK_TOKEN"
      }
    }
  }
}
```

Replace YOUR_AEROSTACK_TOKEN with your API token from the dashboard.
② Call a Tool
Ask your AI assistant to call a specific tool, or send raw JSON-RPC:
Natural Language Prompt
“Use the _ping tool to verify Hugging Face connectivity by fetching the authenticated user (used internally by Aerostack to validate credentials).”
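A raw JSON-RPC `tools/call` request can also be sent from code. The sketch below is a minimal illustration, assuming the standard JSON-RPC 2.0 / MCP `tools/call` payload shape; the endpoint URL comes from the config above, and `YOUR_AEROSTACK_TOKEN` is a placeholder for your real token:

```python
import json
import urllib.request

ENDPOINT = "https://mcp.aerostack.dev/s/aerostack/mcp-huggingface"

def build_tool_call(name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 tools/call payload for an MCP server."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

def call_tool(name, arguments, token):
    """POST a tools/call request to the live endpoint (needs a valid token)."""
    payload = build_tool_call(name, arguments)
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # your Aerostack API token
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example: verify connectivity (requires a real token)
# result = call_tool("_ping", {}, "YOUR_AEROSTACK_TOKEN")
```

The same `call_tool` helper works for any tool in the table below; only the name and arguments change.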
Using a Workspace?
Add this MCP to your Workspace — your team shares one token, secrets are stored securely, and every AI agent in the workspace can call it without per-user setup.
Give your AI agents access to the largest open-source AI model hub. Search 900K+ models and 200K+ datasets, inspect model cards and configs, browse Spaces, list repository files, and run serverless inference — all through natural language.
Live endpoint: https://mcp.aerostack.dev/s/aerostack/mcp-huggingface
| Tool | Description |
|---|---|
| `_ping` | Verify HF connectivity and show authenticated user |
| `search_models` | Search models by keyword, task, library, author |
| `get_model` | Get model details — tags, config, downloads, model card |
| `search_datasets` | Search datasets by keyword or author |
| `get_dataset` | Get dataset details — card, splits, features, downloads |
| `search_spaces` | Search Spaces (apps) by keyword |
| `run_inference` | Run inference on a model via serverless Inference API |
| `list_model_files` | List files in a model repo with sizes and LFS info |
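As a concrete illustration, a `run_inference` call pairs a model ID with an input string. This is a sketch only: the argument names (`model`, `inputs`) are assumptions modeled on the Hugging Face Inference API convention and should be confirmed against the tool's schema via the server's `tools/list`:

```python
# Hypothetical tools/call payload for run_inference; the argument
# names ("model", "inputs") are assumed, not confirmed — check the
# server's tools/list output for the real schema.
inference_call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "run_inference",
        "arguments": {
            "model": "meta-llama/Llama-3-8B-Instruct",
            "inputs": "Explain quantum computing in one sentence",
        },
    },
}
```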
| Variable | Required | Description | How to Get |
|---|---|---|---|
| `HUGGINGFACE_TOKEN` | Yes | Hugging Face User Access Token with read permissions | huggingface.co → Settings → Access Tokens → New token → select "Read" or "Fine-grained" |

For inference: your token needs the Inference permission. Some (gated) models require accepting terms on the model page first.

Set `HUGGINGFACE_TOKEN` under Project → Secrets.

"Search for the most popular text-generation models"
"Show me details about meta-llama/Llama-3-8B-Instruct"
"Find datasets related to code generation"
"What are the trending Spaces right now?"
"Run inference on meta-llama/Llama-3-8B-Instruct: 'Explain quantum computing in one sentence'"
"List the files in the stabilityai/stable-diffusion-xl-base-1.0 model"
```shell
curl -X POST https://mcp.aerostack.dev/s/aerostack/mcp-huggingface \
  -H 'Content-Type: application/json' \
  -H 'X-Mcp-Secret-HUGGINGFACE-TOKEN: hf_xxxxxxxxxxxx' \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"search_models","arguments":{"query":"text-generation","sort":"downloads","limit":10}}}'
```
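The response to a call like the one above is a JSON-RPC envelope. A small sketch of unwrapping it, assuming the standard MCP result shape where `result.content` is a list of `{"type": "text", "text": ...}` items (verify against an actual response):

```python
def extract_text(response):
    """Pull text items out of an MCP tools/call response.

    Assumes the common MCP result shape: result.content is a list
    of {"type": "text", "text": ...} entries.
    """
    content = response.get("result", {}).get("content", [])
    return [item["text"] for item in content if item.get("type") == "text"]

# Example with a mocked response envelope:
mock = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "10 models found"}]},
}
print(extract_text(mock))  # ['10 models found']
```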
License: MIT
Live Endpoint
https://mcp.aerostack.dev/s/aerostack/mcp-huggingface
Sub-50ms globally · Zero cold start
**Can it run inference on models?**
Yes. The `run_inference` tool sends input to any public or private Hugging Face Inference API endpoint. Specify the model ID and payload; Claude handles the request and returns the model output.

**How do I find the right model?**
Use `search_models` with filters like task type or library. Claude returns ranked results with download counts so you can compare options before calling `get_model` for full metadata.

**What does `list_model_files` return?**
It lists the file tree for a repo (safetensors, config.json, tokenizer files, etc.) with sizes and commit SHAs — useful for auditing what a model contains before pulling it into a pipeline.

**Can I monitor trending datasets automatically?**
Yes. Chain `search_datasets` (sorted by trending/downloads) with your data pipeline tool. Claude can poll periodically and pass the dataset ID to a downstream ingestion workflow.
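The polling pattern described above can be sketched in a few lines. This is a hedged sketch: the argument names (`query`, `sort`, `limit`) mirror the curl example for `search_models` and should be confirmed against the server's `tools/list`, and `handler` stands in for whatever downstream ingestion step you use:

```python
import json
import time
import urllib.request

ENDPOINT = "https://mcp.aerostack.dev/s/aerostack/mcp-huggingface"

def build_dataset_search(query, limit=5):
    """JSON-RPC payload for search_datasets, sorted by downloads.

    Argument names mirror the search_models curl example; confirm
    them against the server's tools/list output.
    """
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "search_datasets",
            "arguments": {"query": query, "sort": "downloads", "limit": limit},
        },
    }

def poll_datasets(token, query, handler, interval_s=3600, max_polls=1):
    """Poll search_datasets and hand each response to a downstream handler."""
    for _ in range(max_polls):
        req = urllib.request.Request(
            ENDPOINT,
            data=json.dumps(build_dataset_search(query)).encode(),
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {token}",  # Aerostack token
            },
        )
        with urllib.request.urlopen(req) as resp:
            handler(json.load(resp))
        time.sleep(interval_s)
```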