# langchain-box-mcp-adapter
This sample project uses the LangChain MCP adapters to connect to the Box MCP server. It demonstrates how to integrate LangChain with a Box MCP server using tools and agents.
## Features

- LangChain Integration: Utilizes LangChain's `ChatOpenAI` model for AI interactions.
- MCP Server Communication: Connects to the Box MCP server using `stdio` transport.
- Tool Loading: Dynamically loads tools from the MCP server.
- Agent Creation: Creates a ReAct-style agent for handling user prompts and tool interactions (see the sketch after this list).
- Rich Console Output: Provides a user-friendly console interface with markdown rendering and typewriter effects.
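To show how these pieces fit together, here is a minimal sketch of the overall flow: start the Box MCP server over `stdio`, load its tools with the LangChain MCP adapters, and hand them to a ReAct agent. This is an illustration rather than a copy of `src/simple_client.py`; the server path, model name, and prompt are placeholders.

```python
import asyncio

from langchain_mcp_adapters.tools import load_mcp_tools
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the Box MCP server as a subprocess over stdio (placeholder path).
server_params = StdioServerParameters(
    command="uv",
    args=["--directory", "/path/to/mcp-server-box", "run", "src/mcp_server_box.py"],
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Expose the Box MCP tools as LangChain tools.
            tools = await load_mcp_tools(session)
            # Build a ReAct-style agent around ChatOpenAI with those tools.
            agent = create_react_agent(ChatOpenAI(model="gpt-4o"), tools)
            result = await agent.ainvoke(
                {"messages": [("user", "List the files in my Box root folder.")]}
            )
            print(result["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())
```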
## Requirements

- Python 3.13 or higher
- Dependencies listed in `pyproject.toml`:
  - `langchain-mcp-adapters>=0.0.8`
  - `langchain-openai>=0.3.12`
  - `langgraph>=0.3.29`
  - `rich>=14.0.0`
## Setup

1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd langchain-box-mcp-adapter
   ```

2. Install dependencies:

   ```bash
   uv sync
   ```

3. Create a `.env` file in the root of the project and fill in the information:

   ```
   LANGSMITH_TRACING = "true"
   LANGSMITH_API_KEY =
   OPENAI_API_KEY =
   BOX_CLIENT_ID = ""
   BOX_CLIENT_SECRET = ""
   BOX_SUBJECT_TYPE = "user"
   BOX_SUBJECT_ID = ""
   ```

4. Ensure the MCP server is set up and accessible at the path specified in the project.

5. Update the `StdioServerParameters` in `src/simple_client.py` or `src/graph.py` with the correct path to your MCP server script:

   ```python
   server_params = StdioServerParameters(
       command="uv",
       args=[
           "--directory",
           "/your/absolute/path/to/the/mcp/server/mcp-server-box",
           "run",
           "src/mcp_server_box.py",
       ],
   )
   ```
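As an optional sanity check before running either client, you can confirm that the values from `.env` are actually visible to the process. The snippet below is a small, standalone sketch using only the standard library; it assumes the variables have already been loaded into the environment (for example by your shell or a dotenv loader) and is not part of the project code.

```python
import os

# Variables this sample expects (see the .env example above).
REQUIRED = [
    "OPENAI_API_KEY",
    "BOX_CLIENT_ID",
    "BOX_CLIENT_SECRET",
    "BOX_SUBJECT_TYPE",
    "BOX_SUBJECT_ID",
]

missing = [name for name in REQUIRED if not os.environ.get(name)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
print("All required Box and OpenAI variables are set.")
```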
## Usage

### Running the Simple Client

To run the simple client:

```bash
uv run src/simple_client.py
```
This starts a console-based application where you can interact with the AI agent. Enter prompts, and the agent will respond using the Box MCP tools and the OpenAI model.
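For context, the interactive loop behind the simple client roughly follows the shape below. The names are illustrative (the real console handling, including the typewriter effect, lives in `src/console_utils/console_app.py`), and `agent` is assumed to be the ReAct agent built from the MCP tools.

```python
from rich.console import Console
from rich.markdown import Markdown

async def chat_loop(agent) -> None:
    """Illustrative prompt loop: read a prompt, run the agent, render markdown."""
    console = Console()
    while True:
        prompt = console.input("[bold green]You: [/bold green]")
        if prompt.strip().lower() in {"exit", "quit"}:
            break
        result = await agent.ainvoke({"messages": [("user", prompt)]})
        # Render the agent's final message as markdown in the terminal.
        console.print(Markdown(result["messages"][-1].content))
```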
### Running the Graph-Based Agent (LangGraph)

The graph-based agent is exposed through the `make_graph` function in `src/graph.py`, which is useful for more complex workflows. To run it with the LangGraph development server:

```bash
uv run langgraph dev --config src/langgraph.json
```
The LangGraph development server starts and prints local URLs (including a LangGraph Studio link) where you can inspect and interact with the graph.
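For orientation, a `make_graph` factory in this kind of setup typically wraps the MCP session in an async context manager so the LangGraph server can manage its lifetime. The sketch below illustrates that pattern; it is not a copy of `src/graph.py`, and the server path and model name are placeholders.

```python
from contextlib import asynccontextmanager

from langchain_mcp_adapters.tools import load_mcp_tools
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder path to your local clone of mcp-server-box.
server_params = StdioServerParameters(
    command="uv",
    args=["--directory", "/path/to/mcp-server-box", "run", "src/mcp_server_box.py"],
)

@asynccontextmanager
async def make_graph():
    """Yield a ReAct agent whose tools come from the Box MCP server."""
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await load_mcp_tools(session)
            yield create_react_agent(ChatOpenAI(model="gpt-4o"), tools)
```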
## Project Structure

- `src/simple_client.py`: Main entry point for the simple client.
- `src/graph.py`: Contains the graph-based agent setup.
- `src/console_utils/console_app.py`: Utility functions for console interactions.
- `src/langgraph.json`: Configuration for the LangGraph integration.
## License
This project is licensed under the MIT License. See the LICENSE file for details.
## Contributing
Contributions are welcome! Please open an issue or submit a pull request for any improvements or bug fixes.