Atla MCP Server
An MCP server implementation that provides a standardized interface for LLMs to interact with the Atla API and use our state-of-the-art evaluation models.
Features
- Evaluate individual responses with Selene 1
- Run batch evaluations with Selene 1
- List available evaluation metrics, create new ones or fetch them by name
Usage
To use the MCP server, you will need an Atla API key. You can find your existing API key, or create a new one, in the Atla platform.
Remote Usage
Atla provides a hosted MCP server that can be used by any MCP client. This means that you can use the MCP server without needing to clone the repository and run it locally.
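The client configurations below connect to this hosted endpoint through mcp-remote. If you prefer to talk to it directly from Python, a minimal sketch using the official MCP Python SDK's SSE client looks roughly like this (the mcp package, and direct SSE access with a bearer token, are assumptions of this example rather than requirements of the clients below):
# Sketch: connecting to the hosted Atla MCP server directly from Python.
# Assumes the official MCP Python SDK ("mcp" package) is installed and that
# the endpoint accepts a direct SSE connection with a bearer token.
import asyncio
import os

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    headers = {"Authorization": f"Bearer {os.environ['ATLA_API_KEY']}"}
    async with sse_client("https://mcp.atla-ai.com/sse", headers=headers) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())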
Connecting to the Server
Claude Desktop
For more details on configuring MCP servers in Claude Desktop, refer to the official MCP quickstart guide.
- Add the following to your claude_desktop_config.json file:
{
  "mcpServers": {
    "atla-mcp-server": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.atla-ai.com/sse",
        "--header",
        "Authorization: Bearer ${ATLA_API_KEY}"
      ],
      "env": {
        "ATLA_API_KEY": "<your-atla-api-key>"
      }
    }
  }
}
- Restart Claude Desktop to apply the changes.
You should now see options from atla-mcp-server in the list of available MCP tools.
Cursor
For more details on configuring MCP servers in Cursor, refer to the official documentation.
- Add the following to your .cursor/mcp.json file:
{
  "mcpServers": {
    "atla-mcp-server": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.atla-ai.com/sse",
        "--header",
        "Authorization: Bearer ${ATLA_API_KEY}"
      ],
      "env": {
        "ATLA_API_KEY": "<your-atla-api-key>"
      }
    }
  }
}
You should now see atla-mcp-server in the list of available MCP servers.
OpenAI Agents SDK
For more details on using the OpenAI Agents SDK with MCP servers, refer to the official documentation.
- Install the OpenAI Agents SDK:
pip install openai-agents
- Use the OpenAI Agents SDK to connect to the server:
import os
from agents import Agent
from agents.mcp import MCPServerStdio

async with MCPServerStdio(
    params={
        "command": "npx",
        "args": ["mcp-remote", "https://mcp.atla-ai.com/sse", "--header", "Authorization: Bearer ${ATLA_API_KEY}"],
        "env": {"ATLA_API_KEY": os.environ.get("ATLA_API_KEY")}
    }
) as atla_mcp_server:
    # Create an agent with the Atla evaluation server
    agent = Agent(
        name="AssistantWithAtlaEval",
        instructions="""
        You are a helpful assistant. Your goal is to provide high-quality responses to user requests.
        You can use the Atla evaluation tool to improve your responses.
        """,
        mcp_servers=[atla_mcp_server],
        model="gpt-4o-mini"
    )
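The snippet above only constructs the agent. A fuller, runnable sketch of the same setup that also executes the agent with the SDK's Runner while the MCP connection is open (the prompt text below is a placeholder):
import asyncio
import os

from agents import Agent, Runner
from agents.mcp import MCPServerStdio


async def main() -> None:
    async with MCPServerStdio(
        params={
            "command": "npx",
            "args": ["mcp-remote", "https://mcp.atla-ai.com/sse", "--header", "Authorization: Bearer ${ATLA_API_KEY}"],
            "env": {"ATLA_API_KEY": os.environ.get("ATLA_API_KEY")},
        }
    ) as atla_mcp_server:
        agent = Agent(
            name="AssistantWithAtlaEval",
            instructions="You are a helpful assistant. Use the Atla evaluation tools to check and improve your responses.",
            mcp_servers=[atla_mcp_server],
            model="gpt-4o-mini",
        )
        # The agent can call the Atla MCP tools while producing its answer.
        result = await Runner.run(agent, "Explain what an MCP server is in two sentences.")
        print(result.final_output)


asyncio.run(main())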
Local Usage
Local hosting is the conventional way of interacting with MCP servers. Running the server locally also allows you to extend functionality by adding new tools and resources.
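For example, tools on an MCP server built with the official MCP Python SDK are registered with a decorator on the server object. The sketch below is purely illustrative (the server name and word_count tool are hypothetical and not part of atla-mcp-server):
# Illustrative only: how a tool is declared with the official MCP Python SDK's
# FastMCP helper. The word_count tool below is hypothetical.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("example-server")


@mcp.tool()
def word_count(text: str) -> int:
    """Count the number of words in a piece of text."""
    return len(text.split())


if __name__ == "__main__":
    mcp.run(transport="stdio")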
Installation
We recommend using uv to manage the Python environment.
- Clone the repository:
git clone https://github.com/atla-ai/atla-mcp-server.git
cd atla-mcp-server
- Create and activate a virtual environment:
uv venv
source .venv/bin/activate
- Install dependencies depending on your needs:
# Basic installation
uv pip install -e .
# Installation with development tools (recommended)
uv pip install -e ".[dev]"
pre-commit install
- Add your ATLA_API_KEY to your environment:
export ATLA_API_KEY=<your-atla-api-key>
Running the Server
After installation, you can run the server in several ways:
- Using uv run (recommended):
uv run atla-mcp-server
- Using Python directly:
python -m atla_mcp_server
- From the repository root:
python src/atla_mcp_server/__main__.py
All methods will start the MCP server with stdio transport, ready to accept connections from MCP clients.
MCP Inspector
When developing locally, you can also run the MCP Inspector to test and debug the MCP server:
uv run mcp dev src/atla_mcp_server/__main__.py
Connecting to the Server
Once the server is running, you can connect to it using any MCP client.
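For example, a minimal Python client built on the official MCP SDK can spawn the local server over stdio and list its tools (a sketch; the mcp package and the /path/to/atla-mcp-server placeholder are assumptions to adapt to your setup):
# Sketch: connecting to the locally installed server with the official MCP
# Python SDK. The path and environment values are placeholders.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

params = StdioServerParameters(
    command="uv",
    args=["run", "--directory", "/path/to/atla-mcp-server", "atla-mcp-server"],
    env={"ATLA_API_KEY": os.environ.get("ATLA_API_KEY", "")},
)


async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())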
Claude Desktop
Follow the instructions above, but update your configuration file to use the local server:
{
  "mcpServers": {
    "atla-mcp-server": {
      "command": "/path/to/uv",
      "args": [
        "--directory",
        "/path/to/atla-mcp-server",
        "run",
        "atla-mcp-server"
      ],
      "env": {
        "ATLA_API_KEY": "<your-atla-api-key>"
      }
    }
  }
}
Cursor
Follow the instructions above, but update your configuration file to use the local server:
{
  "mcpServers": {
    "atla-mcp-server": {
      "command": "/path/to/uv",
      "args": [
        "--directory",
        "/path/to/atla-mcp-server",
        "run",
        "atla-mcp-server"
      ],
      "env": {
        "ATLA_API_KEY": "<your-atla-api-key>"
      }
    }
  }
}
OpenAI Agents SDK
Follow the instructions above, but update your configuration to use the local server:
import os
from agents import Agent
from agents.mcp import MCPServerStdio

async with MCPServerStdio(
    params={
        "command": "uv",
        "args": ["run", "--directory", "/path/to/atla-mcp-server", "atla-mcp-server"],
        "env": {"ATLA_API_KEY": os.environ.get("ATLA_API_KEY")}
    }
) as atla_mcp_server:
    ...
Contributing
Contributions are welcome! Please see the CONTRIBUTING.md file for details.
License
This project is licensed under the MIT License. See the LICENSE file for details.