
MCP Hub
A sophisticated research assistant that orchestrates an 8-step workflow of connected AI agents to provide deep research capabilities, including question enhancement, web search, summarization, citation formatting, result combination, code generation, sandboxed code execution, and final summarization.
MCP Hub Project - Deep Research & Code Assistant
Overview
The MCP (Model Context Protocol) Hub is a sophisticated research and code assistant built using Gradio's MCP server functionality. This project demonstrates how to build a workflow of connected AI agents that work together to provide deep research capabilities and generate executable Python code.
The system orchestrates an 8-step deep research and code generation workflow:
- Question Enhancement: Breaks down a user's original query into three distinct sub-questions
- Web Search: Conducts web searches for each sub-question using Tavily API
- LLM Summarization: Summarizes search results for each sub-question using Nebius LLMs
- Citation Formatting: Extracts and formats citations from web search results
- Result Combination: Merges all summaries into a comprehensive grounded context
- Code Generation: Creates Python code based on the research findings using Qwen2.5-Coder-32B-Instruct-fast
- Code Execution: Runs the generated code in a Modal sandbox environment
- Final Summary: Provides a natural language summary of the entire process
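The 8-step flow above can be sketched as plain Python. The real agents call the Tavily, Nebius, and Modal APIs; the helper functions here are stand-in stubs with invented names, shown only to make the data flow between steps concrete:

```python
def enhance_question(query: str) -> list[str]:
    # Step 1: split the query into three sub-questions (stubbed).
    return [f"{query} (sub-question {i})" for i in (1, 2, 3)]

def search_summarize_cite(sub_q: str) -> tuple[str, str]:
    # Steps 2-4: web search, LLM summary, and citation (all stubbed).
    return f"summary of '{sub_q}'", f"citation for '{sub_q}'"

def orchestrate(query: str) -> dict:
    sub_questions = enhance_question(query)
    pairs = [search_summarize_cite(q) for q in sub_questions]
    summaries = [s for s, _ in pairs]
    citations = [c for _, c in pairs]
    context = "\n\n".join(summaries)            # Step 5: grounded context
    code = f"# code generated from {len(context)}-char context"  # Step 6
    output = "<sandbox output placeholder>"     # Step 7: Modal sandbox run
    summary = f"Processed {len(sub_questions)} sub-questions."   # Step 8
    return {"code": code, "output": output,
            "summary": summary, "citations": citations}

result = orchestrate("Analyze sentiment from Twitter data")
```

Each stage only consumes the structured output of the previous one, which is what lets the agents run as independent MCP services.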
Features
- MCP Server Implementation: Built on Gradio's MCP server capabilities for seamless agent communication
- Multi-Agent Architecture: Demonstrates how to build interconnected agent services
- Real-time Web Search: Integration with Tavily API for up-to-date information
- LLM Processing: Uses Nebius (OpenAI-compatible) models for text processing
- Structured Workflow: Showcases a sophisticated multi-step AI research process
- Citation Generation: Automatically formats APA-style citations from web sources
- Code Generation: Creates executable Python code based on research findings
- Code Execution: Runs generated code in a Modal sandbox environment
- Final Summary: Provides a natural language summary of the entire process
Prerequisites
- Python 3.12+
- API keys for:
  - Nebius API
  - Tavily API
- Modal account (for code execution in sandbox)
Installation
- Clone this repository
- Create a virtual environment (recommended)
python -m venv venv
# Activate the virtual environment:
# On Windows:
venv\Scripts\activate
# On macOS/Linux:
source venv/bin/activate
- Install dependencies:
pip install gradio[mcp] openai tavily-python python-dotenv modal
# or use the pyproject.toml with your preferred Python package manager:
# pip install -e .
- Create a .env file with the following content:
NEBIUS_API_KEY=nb-...
TAVILY_API_KEY=tvly-...
CURRENT_YEAR=2025 # Optional, used for citation formatting
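These values are typically loaded with python-dotenv. For illustration, a stdlib-only parser handling the same KEY=value format (with optional # comments) might look like this; load_env and the dummy values are illustrative, not the project's actual loader:

```python
import os
import tempfile

def load_env(path: str) -> None:
    # Parse KEY=value lines, ignoring blanks and '#' comments,
    # without overwriting variables that are already set.
    with open(path) as f:
        for line in f:
            line = line.split("#", 1)[0].strip()
            if "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())

# Demonstrate with a throwaway .env file containing dummy keys.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("NEBIUS_API_KEY=nb-dummy\n"
            "TAVILY_API_KEY=tvly-dummy\n"
            "CURRENT_YEAR=2025  # Optional\n")
    env_path = f.name

load_env(env_path)
```

In the actual project, `load_dotenv()` from python-dotenv does this work in one call.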
Usage
Run the main application:
python main.py
This will launch the Gradio interface at http://127.0.0.1:7860/
The MCP schema will be available at http://127.0.0.1:7860/gradio_api/mcp/schema
Available Agents
The project includes several agent services:
- Question Enhancer: Splits a request into three sub-questions using Qwen3-4B-fast
- Web Search Agent: Performs web searches via Tavily API (top-3 results)
- LLM Processor: Processes text with Nebius LLMs (Meta-Llama-3.1-8B-Instruct) for summarization, reasoning, or keyword extraction
- Citation Formatter: Extracts URLs and formats them as APA-style citations
- Code Generator: Creates Python code snippets based on research context using Qwen2.5-Coder-32B-Instruct-fast
- Code Runner: Executes Python code in a Modal sandbox environment
- Orchestrator: Coordinates all agents in a cohesive workflow
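The Citation Formatter's core job can be sketched in a few lines: pull the host out of each search-result URL and render a simple APA-style entry. The function name and exact output format below are illustrative, not the project's actual implementation:

```python
from urllib.parse import urlparse

def apa_web_citation(url: str, title: str, year: int) -> str:
    # APA-style web citation: Title. (Year). Site. URL
    site = urlparse(url).netloc
    return f"{title}. ({year}). {site}. {url}"

citation = apa_web_citation(
    "https://docs.example.com/guide", "Example Guide", 2025)
```

The CURRENT_YEAR environment variable mentioned above would supply the year when a source has no publication date.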
Tutorial Scripts
The tutorial_scripts/ directory contains example Gradio applications and code samples for learning:
- simple_app.py: A basic Gradio interface
- letter_count.py: A simple letter counting example
- predict_letter_count.py: Example of letter counting prediction
- modal_inference.py: Demonstrates using Modal for code execution
- nebius_inference.py: Shows how to use Nebius API for inference
- nebius_tool_calling.py: Example of tool calling with Nebius models
- Gradio Cheat Sheet.md: Quick reference for Gradio features and usage
MCP Implementation Details
This project demonstrates how to:
- Create MCP-compatible function definitions with proper typing and docstrings
- Launch a Gradio app as an MCP server (mcp_server=True)
- Structure a multi-agent workflow
- Pass data between agents in a structured format
- Execute code safely in a sandbox environment
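Because Gradio derives an MCP tool's schema from the function's type hints and docstring, an MCP-compatible function definition follows the shape below. The function itself is a made-up example, not one of the project's agents:

```python
def count_keyword(text: str, keyword: str) -> int:
    """Count case-insensitive occurrences of a keyword in a text.

    Args:
        text: The text to scan.
        keyword: The keyword to count.

    Returns:
        The number of occurrences found.
    """
    return text.lower().count(keyword.lower())

# Gradio would expose this via gr.Interface(...).launch(mcp_server=True),
# turning the signature and docstring into the MCP tool schema.
hits = count_keyword("MCP servers make MCP tools discoverable", "mcp")
```

Complete type hints and Args/Returns sections matter here: they become the parameter descriptions MCP clients see.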
Example Workflow
- A user submits a high-level request like "Write Python code to analyze sentiment from Twitter data"
- The system breaks this into three sub-questions (e.g., about Twitter APIs, sentiment analysis techniques, and Python libraries)
- For each sub-question, it:
- Performs a web search using Tavily
- Summarizes the search results
- Extracts citations from URLs
- The sub-summaries are combined into a comprehensive grounded context
- Based on this context, Python code is generated
- The code is executed in a Modal sandbox
- The user receives the final summary, generated code, execution output, and citations
License
[Your license information here]
Contributing
[Your contribution guidelines here]
Acknowledgments
- Gradio for providing the MCP server functionality
- Nebius for LLM capabilities
- Tavily for web search capabilities