Langfuse MCP Server

PyPI | Python 3.10–3.13 | License: MIT

Model Context Protocol (MCP) server for Langfuse observability. Query traces and observations, debug errors, analyze sessions, and manage prompts and datasets.

Why langfuse-mcp?

Comparison with official Langfuse MCP (as of Jan 2026):

Feature | langfuse-mcp | Official
Traces & Observations | Yes | No
Sessions & Users | Yes | No
Exception Tracking | Yes | No
Prompt Management | Yes | Yes
Dataset Management | Yes | No
Selective Tool Loading | Yes | No

This project provides a full observability toolkit (traces, observations, sessions, exceptions, prompts, and datasets), while the official MCP focuses on prompt management.

Quick Start

Requires uv (for uvx).

Get credentials from Langfuse Cloud → Settings → API Keys. If self-hosted, use your instance URL for LANGFUSE_HOST.

# Claude Code (project-scoped, shared via .mcp.json)
claude mcp add \
  --scope project \
  --env LANGFUSE_PUBLIC_KEY=pk-... \
  --env LANGFUSE_SECRET_KEY=sk-... \
  --env LANGFUSE_HOST=https://cloud.langfuse.com \
  langfuse -- uvx --python 3.11 langfuse-mcp

# Codex CLI (user-scoped, stored in ~/.codex/config.toml)
codex mcp add langfuse \
  --env LANGFUSE_PUBLIC_KEY=pk-... \
  --env LANGFUSE_SECRET_KEY=sk-... \
  --env LANGFUSE_HOST=https://cloud.langfuse.com \
  -- uvx --python 3.11 langfuse-mcp

Restart your CLI, then verify with /mcp (Claude Code) or codex mcp list (Codex).

Tools (25 total)

Category | Tools
Traces | fetch_traces, fetch_trace
Observations | fetch_observations, fetch_observation
Sessions | fetch_sessions, get_session_details, get_user_sessions
Exceptions | find_exceptions, find_exceptions_in_file, get_exception_details, get_error_count
Prompts | list_prompts, get_prompt, get_prompt_unresolved, create_text_prompt, create_chat_prompt, update_prompt_labels
Datasets | list_datasets, get_dataset, list_dataset_items, get_dataset_item, create_dataset, create_dataset_item, delete_dataset_item
Schema | get_data_schema

Dataset Item Updates (Upsert)

Langfuse uses upsert for dataset items. To edit an existing item, call create_dataset_item with item_id. If the ID exists, it updates; otherwise it creates a new item.

create_dataset_item(
  dataset_name="qa-test-cases",
  item_id="item_123",
  input={"question": "What is 2+2?"},
  expected_output={"answer": "4"}
)
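
Because this is an upsert, re-running the call with the same item_id overwrites the stored fields instead of creating a duplicate item. A sketch reusing the item above (the added explanation field is illustrative):

create_dataset_item(
  dataset_name="qa-test-cases",
  item_id="item_123",  # same ID, so the existing item is updated in place
  input={"question": "What is 2+2?"},
  expected_output={"answer": "4", "explanation": "basic arithmetic"}
)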

Skill

This project includes a skill with debugging playbooks.

Via skild.sh (registry-based):

npx skild install @avivsinai/langfuse

Via skills.sh (GitHub-based):

npx skills add avivsinai/langfuse-mcp

Manual install:

cp -r skills/langfuse ~/.claude/skills/   # Claude Code
cp -r skills/langfuse ~/.codex/skills/    # Codex CLI

Try asking: "help me debug langfuse traces"

See skills/langfuse/SKILL.md for full documentation.

Selective Tool Loading

Load only the tool groups you need to reduce token overhead:

langfuse-mcp --tools traces,prompts

Available groups: traces, observations, sessions, exceptions, prompts, datasets, schema
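
When the server is launched by an MCP client, append the flag after the package name; anything following langfuse-mcp is passed through to the server. A sketch based on the Claude Code command from the Quick Start:

claude mcp add \
  --scope project \
  --env LANGFUSE_PUBLIC_KEY=pk-... \
  --env LANGFUSE_SECRET_KEY=sk-... \
  --env LANGFUSE_HOST=https://cloud.langfuse.com \
  langfuse -- uvx --python 3.11 langfuse-mcp --tools traces,prompts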

Read-Only Mode

Disable all write operations for safer read-only access:

langfuse-mcp --read-only
# Or via environment variable
LANGFUSE_MCP_READ_ONLY=true langfuse-mcp

This disables: create_text_prompt, create_chat_prompt, update_prompt_labels, create_dataset, create_dataset_item, delete_dataset_item
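
In client configurations, either append --read-only after langfuse-mcp or set the environment variable alongside the Langfuse credentials. A sketch based on the Codex CLI command from the Quick Start:

codex mcp add langfuse \
  --env LANGFUSE_PUBLIC_KEY=pk-... \
  --env LANGFUSE_SECRET_KEY=sk-... \
  --env LANGFUSE_HOST=https://cloud.langfuse.com \
  --env LANGFUSE_MCP_READ_ONLY=true \
  -- uvx --python 3.11 langfuse-mcp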

Other Clients

Cursor

Create .cursor/mcp.json in your project (or ~/.cursor/mcp.json for global):

{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["--python", "3.11", "langfuse-mcp"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-...",
        "LANGFUSE_SECRET_KEY": "sk-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}

Docker

docker run --rm -i \
  -e LANGFUSE_PUBLIC_KEY=pk-... \
  -e LANGFUSE_SECRET_KEY=sk-... \
  -e LANGFUSE_HOST=https://cloud.langfuse.com \
  ghcr.io/avivsinai/langfuse-mcp:latest
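
To run the container from a JSON-configured client such as Cursor, point the command at docker instead of uvx. A sketch mirroring the Cursor config above, assuming the same image tag; the -e flags forward the credentials from the env block into the container:

{
  "mcpServers": {
    "langfuse": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "LANGFUSE_PUBLIC_KEY",
        "-e", "LANGFUSE_SECRET_KEY",
        "-e", "LANGFUSE_HOST",
        "ghcr.io/avivsinai/langfuse-mcp:latest"
      ],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-...",
        "LANGFUSE_SECRET_KEY": "sk-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}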

Development

uv venv --python 3.11 .venv && source .venv/bin/activate
uv pip install -e ".[dev]"
pytest

License

MIT
