atom-mcp-server
Global price benchmarking for AI inference across 1,600+ SKUs from 40+ vendors. Query live pricing, market indexes, and model specs via 8 tools. Free tier available.
<p align="center"> <img src="https://raw.githubusercontent.com/StamatiosKanellakis/A7OM/main/ATOM_Logo_Gray.png" alt="ATOM" width="200" /> </p>
<h1 align="center">ATOM MCP Server</h1>
<p align="center"> <strong>AI Inference Pricing Intelligence — delivered as a native tool for AI agents.</strong><br/> 1,600+ SKUs · 40+ vendors · 6 modalities · 4 channels · 14 AIPI indexes · Updated weekly </p>
<p align="center"> <a href="https://a7om.com">Website</a> · <a href="https://a7om.com/mcp">ATOM MCP Pro</a> · <a href="https://smithery.ai/server/@a7om/atom-mcp-server">Smithery</a> </p>
## What Is This?
ATOM MCP Server lets any MCP-compatible AI agent (Claude, GPT, Cursor, Windsurf, VS Code Copilot) query live AI inference pricing data programmatically. Think of it as the Bloomberg Terminal for AI pricing, accessible via the Model Context Protocol.
Ask your AI assistant a question like "What's the cheapest way to run GPT-4o?" and it calls ATOM's tools behind the scenes, returning a data-backed answer from 1,600+ pricing SKUs across 40+ vendors globally.
Built by ATOM (A7OM) — the world's first methodological inference pricing index.
## AIPI Indexes
14 benchmark indexes across four categories:
| Category | Indexes | What It Answers |
|---|---|---|
| Modality | TXT, MML, IMG, AUD, VID, VOC | What does this type of inference cost? |
| Channel | DEV, CLD, PLT, NCL | Where should you buy — direct, marketplace, platform, or neocloud? |
| Tier | FTR, BDG, RSN | What's the premium for flagship vs budget vs reasoning? |
| Special | OSS | How much cheaper is open-source inference across all channels? |
All indexes are global (GLB), calculated weekly using chained matched-model methodology to eliminate composition bias.
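To make the methodology concrete, here is a minimal sketch of how a chained matched-model index can be computed (illustrative only, not ATOM's actual implementation): each week-over-week link is the geometric mean of price relatives for SKUs present in both weeks, so models entering or leaving the sample shift the composition but not the index level.

```python
from math import prod

def chained_matched_index(weekly_prices, base=100.0):
    """Chain a price index over weekly snapshots.

    weekly_prices: list of dicts mapping SKU -> price, one per week.
    Each link uses only SKUs present in both adjacent weeks (the
    "matched" set), so new or delisted models cannot distort the level.
    """
    index = [base]
    for prev, curr in zip(weekly_prices, weekly_prices[1:]):
        matched = prev.keys() & curr.keys()
        if not matched:
            index.append(index[-1])  # no overlap: carry the level forward
            continue
        # geometric mean of matched price relatives
        ratios = [curr[sku] / prev[sku] for sku in matched]
        link = prod(ratios) ** (1 / len(ratios))
        index.append(index[-1] * link)
    return index

weeks = [
    {"model-a": 2.0, "model-b": 4.0},
    {"model-a": 1.0, "model-b": 4.0, "model-c": 9.0},  # model-c enters; ignored this link
    {"model-a": 1.0, "model-c": 3.0},                  # model-b exits
]
print(chained_matched_index(weeks))  # roughly [100.0, 70.71, 40.82]
```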
## Tools

| Tool | Tier | Description |
|---|---|---|
| `list_vendors` | Free | All tracked vendors with country, region, channel type, and pricing page URLs |
| `get_kpis` | Free | 6 market KPIs: output premium, caching savings, open-source advantage, context cost curve, caching availability, size spread |
| `get_index_benchmarks` | Free | AIPI price benchmarks across 14 indexes — modality, channel, tier, and licensing |
| `get_market_stats` | Tiered | Aggregate market intelligence: medians, quartiles, distributions, modality breakdown |
| `search_models` | Tiered | Multi-filter search: modality, vendor, creator, open-source, price range, context window, parameters |
| `get_model_detail` | Tiered | Full specs + pricing across all vendors for a single model |
| `compare_prices` | Tiered | Cross-vendor price comparison for a model or model family |
| `get_vendor_catalog` | Tiered | Complete catalog for a specific vendor: all models, modalities, and pricing |
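Under the hood, an MCP client invokes these tools with a standard JSON-RPC `tools/call` request. A sketch of what a `search_models` call might look like is below; the argument names are illustrative assumptions, so consult each tool's published input schema for the real parameters:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_models",
    "arguments": {
      "modality": "text",
      "open_source": true,
      "max_price": 0.5
    }
  }
}
```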
## Pricing Tiers

| Feature | ATOM MCP (Free) | ATOM MCP Pro ($49/mo) |
|---|---|---|
| Vendors, KPIs, AIPI indexes | ✅ Full data | ✅ Full data |
| Market stats | Aggregates only | + Vendor-level breakdown |
| Model search & comparison | Counts + price ranges | Full granular SKU data |
| Model detail | Specs only | + Per-vendor pricing |
| Vendor catalog | Summary only | Full SKU listing |

**Free tier (no API key):** Enough to understand the market — counts, ranges, distributions, benchmarks.

**ATOM MCP Pro ($49/mo):** Full granular data — every vendor, model, price, and spec. → [a7om.com/mcp](https://a7om.com/mcp)
## Quick Start

### Option 1: Remote URL — Claude.ai / Claude Desktop (recommended)
No install required. Connect directly to ATOM's hosted server:
**Claude.ai (web):** Settings → Connectors → Add custom connector

- Name: `ATOM Pricing Intelligence`
- URL: `https://atom-mcp-server-production.up.railway.app/mcp`

**Claude Desktop:** Settings → Developer → Edit Config

```json
{
  "mcpServers": {
    "atom-pricing": {
      "url": "https://atom-mcp-server-production.up.railway.app/mcp"
    }
  }
}
```
Note: Remote URL support requires a recent Claude Desktop version. If it doesn't work, use the npx method below.
**Claude Desktop (via npx proxy):**

```json
{
  "mcpServers": {
    "atom-pricing": {
      "command": "npx",
      "args": ["mcp-remote", "https://atom-mcp-server-production.up.railway.app/mcp"]
    }
  }
}
```
### Option 2: Local (stdio) — for Cursor, Windsurf, etc.

```bash
git clone https://github.com/A7OM-AI/atom-mcp-server.git
cd atom-mcp-server
npm install && npm run build
```
Add to your MCP client config:
```json
{
  "mcpServers": {
    "atom-pricing": {
      "command": "node",
      "args": ["/path/to/atom-mcp-server/dist/index.js"],
      "env": {
        "SUPABASE_URL": "https://jonncmzxvxzwyaznokba.supabase.co",
        "SUPABASE_ANON_KEY": "your-anon-key"
      }
    }
  }
}
```
### Option 3: Deploy your own (Railway)

Set environment variables in the Railway dashboard:

- `SUPABASE_URL`
- `SUPABASE_ANON_KEY`
- `ATOM_API_KEYS` (comma-separated, for paid tier validation)
- `TRANSPORT=http`
## Example Queries
Once connected, just ask your AI assistant naturally:
- "What's the cheapest way to run GPT-4o?"
- "Compare Claude Sonnet 4.5 pricing across all vendors"
- "Find open-source text models under $0.50 per million tokens"
- "Show me Google's full model catalog"
- "What are the AIPI benchmark prices for text inference?"
- "How do neocloud prices compare to cloud marketplaces?"
- "How much cheaper is open-source inference?"
- "Give me a market overview of AI inference pricing"
- "What are the key market KPIs for AI inference?"
## Environment Variables

| Variable | Required | Description |
|---|---|---|
| `SUPABASE_URL` | Yes | Supabase project URL |
| `SUPABASE_ANON_KEY` | Yes | Supabase anonymous/public key |
| `ATOM_API_KEYS` | No | Comma-separated valid API keys for paid tier |
| `TRANSPORT` | No | `stdio` (default) or `http` |
| `PORT` | No | HTTP port (default `3000`) |
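Putting these together, a local HTTP deployment (after building per Option 2) might be launched like this; every value below is a placeholder, not a real credential:

```shell
# placeholder values — substitute your own Supabase project URL and keys
export SUPABASE_URL="https://your-project.supabase.co"
export SUPABASE_ANON_KEY="your-anon-key"
export ATOM_API_KEYS="pro-key-1,pro-key-2"   # optional: enables paid-tier validation
export TRANSPORT=http
export PORT=3000
node dist/index.js
```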
## Tech Stack

- TypeScript / Node.js
- MCP SDK (`@modelcontextprotocol/sdk`)
- Supabase (PostgreSQL) via REST API
- Express (HTTP transport)
- Zod (schema validation)
## About ATOM
ATOM tracks 1,600+ AI inference pricing SKUs from 40+ vendors globally through the AIPI (ATOM Inference Price Index) system — the first methodological price benchmark for AI inference. 14 indexes span modality, channel, tier, and licensing dimensions, updated weekly using chained matched-model methodology to eliminate composition bias.
Vendors are classified across four distribution channels: Model Developers (direct API), Cloud Marketplaces (AWS Bedrock, Google Vertex, Azure), Inference Platforms (DeepInfra, Fireworks, Together AI), and Neoclouds (Groq, Cerebras).
Products: ATOM MCP · ATOM Terminal · ATOM Feed
## License
MIT
<p align="center"><strong>ATOM</strong> — <em>The Global Price Benchmark for AI Inference.</em></p>