
# IDE Chat Summarizer MCP Server

A Model Context Protocol (MCP) server designed for IDE users to summarize chat conversations with AI and automatically store them as organized markdown files in your notes directory. Perfect for VS Code, Cursor, and Visual Studio users.
## 🎯 Purpose

This MCP server transforms chat conversations into structured summaries and saves them to your configured notes directory (default: `~/Documents/ChatSummaries`) for easy reference and organization. Perfect for keeping track of important discussions, decisions, and insights from your AI conversations.
## ✨ Features

### 🔧 Tools
- `summarize_chat`: Summarize chat history and save as markdown
  - Supports different summary styles (brief, detailed, bullet_points)
  - **NEW**: Smart code detection and preservation of final solutions
  - **NEW**: Options for handling large histories (`include_full_history`, `create_separate_full_history`)
  - Auto-generates timestamped filenames
  - Custom titles for better organization
  - Smart handling: uses collapsible sections for large conversations (>1MB)
- `summarize_large_chat`: Handle extremely large chat histories (**NEW!**)
  - Automatically chunks huge conversations into manageable pieces
  - Creates individual files for each chunk, with overlap for context
  - Generates a master summary file linking all chunks
  - Configurable chunk size (default: 50,000 characters)
  - Perfect for multi-hour conversations or extensive documentation
- `list_summaries`: View recent chat summaries
  - Shows creation dates and file sizes
  - Configurable limit for results
  - Sorted by most recent first
- `delete_summary`: Remove unwanted summary files
  - Safety checks to only delete chat summary files
  - Confirmation messages
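
To show how tools like these are typically wired into an MCP server, here is a minimal sketch using the `FastMCP` class from the official MCP Python SDK. The parameter names mirror the descriptions above, but the summary generation is stubbed out and the real signatures in `main.py` may differ.

```python
# Minimal sketch (not the actual main.py): registering a summarize_chat-style
# tool with the MCP Python SDK. Parameter names follow this README; the
# summary text itself is a placeholder.
from datetime import datetime
from pathlib import Path

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("chat-summarizer")
# Default location; see the Configuration section for the CHAT_NOTES_DIR override.
NOTES_DIR = Path.home() / "Documents" / "ChatSummaries"


@mcp.tool()
def summarize_chat(
    chat_history: str,
    title: str = "Untitled",
    summary_style: str = "detailed",
    include_full_history: bool = True,
    create_separate_full_history: bool = False,
) -> str:
    """Summarize a chat history and save it as a markdown file."""
    NOTES_DIR.mkdir(parents=True, exist_ok=True)
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    filename = f"chat_summary_{timestamp}_{title.replace(' ', '_')}.md"
    summary = "(summary generated here according to summary_style)"
    body = f"# {title}\n\n## Summary ({summary_style})\n\n{summary}\n"
    if include_full_history:
        body += f"\n## Full Chat History\n\n{chat_history}\n"
    # create_separate_full_history handling is omitted in this sketch.
    (NOTES_DIR / filename).write_text(body, encoding="utf-8")
    return f"Saved summary to {NOTES_DIR / filename}"


if __name__ == "__main__":
    mcp.run()
```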
### 📄 Resources

- `summary://{filename}`: Read the content of a specific summary file
- `notes://directory`: Get information about your notes directory
### 💬 Prompts

- `create_summary_prompt`: Generate customized prompts for different conversation types
  - Supports: general, technical, meeting, brainstorm
  - Focus areas: all, decisions, action_items, insights
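
Resources and prompts hang off the same server object. Continuing the sketch from the Tools section above (the decorators are the SDK's; the argument names for `create_summary_prompt` are illustrative guesses, not confirmed from `main.py`):

```python
# Sketch continued: resource and prompt declarations on the FastMCP instance
# created in the previous snippet. Argument names are illustrative.
@mcp.resource("summary://{filename}")
def read_summary(filename: str) -> str:
    """Return the content of a specific summary file."""
    return (NOTES_DIR / filename).read_text(encoding="utf-8")


@mcp.prompt()
def create_summary_prompt(conversation_type: str = "general", focus: str = "all") -> str:
    """Build a customized summarization prompt for the given conversation type."""
    return (
        f"Summarize the following {conversation_type} conversation, "
        f"focusing on {focus}. Use clear headings and concise bullet points."
    )
```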
## 🚀 Installation and Setup

### Prerequisites

- uv package manager
- Python 3.13+

### Install Dependencies

```bash
uv sync
```

### Run the Server

#### 1. Using MCP Inspector (Recommended for Development)

```bash
uv run mcp dev main.py
```

Opens a web interface at `http://localhost:6274` for testing.

#### 2. Direct Server Run

```bash
uv run python main.py
```

#### 3. Using MCP CLI

```bash
uv run mcp run main.py
```
## 📝 Usage Examples

### Handling Large Chat Histories

For huge chat histories, you have several options depending on how large the conversation is:
#### Option 1: Standard with Full History (Recommended)

In VS Code/Cursor/Visual Studio, use the `summarize_chat` tool with these parameters:

- `chat_history`: Copy and paste your entire conversation
- `title`: "Long AI Discussion" (optional)
- `summary_style`: Choose "detailed", "brief", or "bullet_points"
- `include_full_history`: ✅ True (default - keeps everything!)
- `create_separate_full_history`: ❌ False

Result: Creates one file with smart organization - collapsible sections for large histories (>1MB); the sketch below shows what those sections look like.
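
The collapsible sections mentioned above are ordinary markdown: an HTML `<details>` block renders as an expandable section on GitHub and in most markdown viewers. A sketch of how the server might apply the 1 MB threshold (the helper name is hypothetical):

```python
# Sketch: wrap a very large history in a collapsible <details> block.
# The 1 MB threshold matches the README; the helper name is hypothetical.
ONE_MB = 1_000_000

def render_full_history(chat_history: str) -> str:
    if len(chat_history.encode("utf-8")) <= ONE_MB:
        return f"## Full Chat History\n\n{chat_history}\n"
    return (
        "## Full Chat History\n\n"
        "<details>\n<summary>Click to expand the full conversation</summary>\n\n"
        f"{chat_history}\n\n</details>\n"
    )
```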
#### Option 2: Separate Files for Organization

In VS Code/Cursor/Visual Studio, use the `summarize_chat` tool with these parameters:

- `chat_history`: Your huge conversation
- `title`: "Extended Coding Session"
- `summary_style`: "detailed"
- `include_full_history`: ✅ True
- `create_separate_full_history`: ✅ True (creates 2 files!)

Result: Creates two files: `chat_summary_*.md` + `chat_full_*.md`
#### Option 3: Extremely Large Histories (Chunking)

In VS Code/Cursor/Visual Studio, use the `summarize_large_chat` tool (NEW!) with these parameters:

- `chat_history`: Your massive conversation
- `title`: "All Day Coding Session"
- `chunk_size`: 50000 (characters per chunk)
- `overlap`: 5000 (overlap between chunks for context)

Result: Creates a master summary plus individual chunk files with preserved context; the sketch below shows the basic chunking idea.
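
Chunking with overlap is simple in principle: slide a window of `chunk_size` characters across the text, stepping forward by `chunk_size - overlap` each time so context carries across chunk boundaries. A hedged sketch (the helper name is illustrative, not necessarily what `main.py` calls it):

```python
# Sketch: split a huge history into overlapping chunks.
# Defaults mirror the README: 50,000-character chunks, 5,000-character overlap.
def chunk_history(chat_history: str, chunk_size: int = 50_000, overlap: int = 5_000) -> list[str]:
    if chunk_size <= overlap:
        raise ValueError("chunk_size must be larger than overlap")
    chunks = []
    start = 0
    while start < len(chat_history):
        chunks.append(chat_history[start : start + chunk_size])
        start += chunk_size - overlap
    return chunks
```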
### Summarizing a Regular Chat

In VS Code/Cursor/Visual Studio, use the `summarize_chat` tool and fill in the parameters:

- `chat_history`:

  ```text
  User: How do I optimize my Python code?
  AI: Here are several optimization techniques...
  User: What about memory usage?
  AI: For memory optimization, consider...
  ```

- `title`: "Python Optimization" (optional)
- `summary_style`: "detailed" (or "brief", "bullet_points")

Result: Creates `chat_summary_20240115_143022_Python_Optimization.md` with the summary and the full conversation.
### Listing Your Summaries

In VS Code/Cursor/Visual Studio, use the `list_summaries` tool with the optional parameter:

- `limit`: 10 (number of summaries to show)

Result: Shows recent summaries with dates, sizes, and filenames.
### Reading a Summary

In VS Code/Cursor/Visual Studio, use the `summary://{filename}` resource with the parameter:

- `filename`: `chat_summary_20240115_143022_Python_Optimization.md`

Result: Returns the full content of the summary file.
### Managing Summaries

In VS Code/Cursor/Visual Studio, use the `delete_summary` tool with the parameter:

- `filename`: Name of the file to delete

Result: Safely removes the summary file (only files that look like chat summaries are deleted; see the sketch below).
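
The safety check amounts to refusing anything that does not match the summary naming pattern or does not live inside the notes directory. An illustrative sketch of that kind of guard (not the literal code in `main.py`):

```python
# Sketch: only delete files that look like chat summaries and live in NOTES_DIR.
def delete_summary_safely(filename: str) -> str:
    path = (NOTES_DIR / filename).resolve()
    if not (path.name.startswith("chat_summary_") and path.suffix == ".md"):
        return f"Refusing to delete {filename}: not a chat summary file."
    if NOTES_DIR.resolve() not in path.parents:
        return f"Refusing to delete {filename}: outside the notes directory."
    path.unlink()
    return f"Deleted {path.name}."
```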
## 🎯 Quick Start Guide for Large Histories

### Step-by-Step: Saving Your Huge Chat

1. 📋 Copy your entire conversation from your chat interface
2. 🔧 Open the MCP tools in VS Code/Cursor/Visual Studio
3. ⚙️ Choose your approach:

   For most large chats (recommended):
   - Tool: `summarize_chat`
   - Paste the conversation into `chat_history`
   - Set `include_full_history`: `true`
   - Leave the other defaults

   For extremely large chats (>100MB):
   - Tool: `summarize_large_chat`
   - Paste the conversation into `chat_history`
   - Leave the defaults (50k chunk size)

4. 🚀 Run the tool - your conversation is now safely stored!
### What You Get

- ✅ Complete preservation - nothing lost from the original
- ✅ Smart organization - easy to read and navigate
- ✅ Searchable files - find anything quickly
- ✅ Multiple formats - summary + full-history options
## 📁 File Organization

Summaries are saved to your configured notes directory (default: `~/Documents/ChatSummaries`).

Filename pattern: `chat_summary_YYYYMMDD_HHMMSS_[title].md`

Example files:

```
chat_summary_20240115_143022_Python_Optimization.md
chat_summary_20240115_150330_API_Design_Discussion.md
chat_summary_20240115_162145_Untitled.md
```
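
The pattern is just a timestamp plus a sanitized title; a sketch of the kind of helper that would produce it (the function name is illustrative):

```python
# Sketch: build a chat_summary_YYYYMMDD_HHMMSS_[title].md filename.
import re
from datetime import datetime

def summary_filename(title: str = "Untitled") -> str:
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    safe_title = re.sub(r"[^A-Za-z0-9]+", "_", title).strip("_") or "Untitled"
    return f"chat_summary_{timestamp}_{safe_title}.md"

# Example: summary_filename("Python Optimization")
# -> "chat_summary_20240115_143022_Python_Optimization.md" (timestamp varies)
```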
## 📋 Summary Styles

### Brief

- 2-3 sentence overview
- Key points only
- Quick reference

### Detailed (Default)

- Comprehensive summary
- Main topics and subtopics
- Key decisions and insights
- Structured with headings

### Bullet Points

- Organized bullet list format
- Main topics as bullets
- Easy to scan
- Action-oriented
## 💻 Smart Code Detection

The summarizer automatically detects and preserves important code from your conversations.

### 🎯 What It Detects

- Code blocks - fenced blocks with an optional language tag
- Inline code - short snippets wrapped in backticks
- Final solutions - code that appears near keywords like "final", "solution", "working", "complete"
- Late-conversation code - code blocks in the last 30% of the conversation (likely to be solutions)

### 🔍 How It Works

1. Scans the conversation for all code blocks and inline code
2. Identifies final solutions using context analysis and position weighting
3. Preserves code in the summary with proper syntax highlighting
4. Organizes by importance - final solutions first, then other code snippets

### 📋 What You Get

- 💻 A "Final Code Solutions" section with the working code
- 📝 A "Code Snippets" section with other relevant code examples
- Language detection and proper syntax highlighting
- Context preservation - knows which code is the final answer
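
To make the heuristic concrete, here is a hedged sketch of that kind of detection: a regular expression pulls out fenced code blocks, and a block is flagged as a likely final solution if it sits in the last 30% of the conversation or has a "solution" keyword nearby. The actual heuristics in `main.py` may differ.

```python
# Sketch of the detection heuristic described above. Illustrative only.
import re

FENCE = "`" * 3  # three backticks, built here so this example nests cleanly in markdown
CODE_BLOCK_RE = re.compile(FENCE + r"(\w+)?\n(.*?)" + FENCE, re.DOTALL)
SOLUTION_KEYWORDS = ("final", "solution", "working", "complete")

def detect_code(chat_history: str) -> list[dict]:
    results = []
    for match in CODE_BLOCK_RE.finditer(chat_history):
        language = match.group(1) or "text"
        code = match.group(2)
        context = chat_history[max(0, match.start() - 200) : match.start()].lower()
        in_last_30_percent = match.start() > 0.7 * len(chat_history)
        near_keyword = any(word in context for word in SOLUTION_KEYWORDS)
        results.append(
            {"language": language, "code": code, "is_final": in_last_30_percent or near_keyword}
        )
    # Final solutions first, then other snippets.
    return sorted(results, key=lambda item: not item["is_final"])
```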
## 🔧 Client Integration

### VS Code

When adding this MCP server to VS Code, use:

Command:

```bash
uv run --directory "/path/to/your/ide-chat-summarizer-mcp" python main.py
```

Replace `/path/to/your/ide-chat-summarizer-mcp` with your actual project directory path.

Server name: `chat-summarizer`
### Cursor

In Cursor settings:

- Go to Extensions → MCP
- Add a server with:
  - Name: `chat-summarizer`
  - Command: `uv run --directory "/path/to/your/ide-chat-summarizer-mcp" python main.py`
  - Working directory: `/path/to/your/ide-chat-summarizer-mcp`
### Visual Studio (Full IDE)

Visual Studio has MCP support! Here's how to configure it:

#### Native MCP Configuration

- Open the Configure MCP Server dialog in Visual Studio
- Fill out the configuration:
  - Server ID: `chat-summarizer`
  - Type: `stdio` (keep this default)
  - Command (with optional arguments): `uv run --directory "/path/to/your/ide-chat-summarizer-mcp" python main.py` (replace `/path/to/your/ide-chat-summarizer-mcp` with your actual project path)
  - Environment variables (optional): click "+ Add" if you want a custom notes directory
    - Name: `CHAT_NOTES_DIR`
    - Value: `/path/to/your/custom/notes/directory`
- Click "Save"
- Restart Visual Studio
## 🎯 Typical Workflow

### For VS Code/Cursor/Visual Studio

1. 💻 Power up your computer
2. 🚀 Open VS Code/Cursor/Visual Studio
3. 💬 Start chatting with AI
4. 🔧 Use the MCP tools directly - the server starts automatically!

### For MCP Inspector (Testing)

1. 💻 Power up your computer
2. 📁 Navigate to the project directory
3. ⚡ Run: `uv run mcp dev main.py`
4. 🌐 Use the web interface at `http://localhost:6274`

💡 Pro tip: Most clients (VS Code, Cursor, Visual Studio) automatically start your MCP server when you use the tools. You only need to start the server manually when testing with MCP Inspector.
## 📊 Directory Information

The server provides insights about your notes directory:

- Total markdown file count
- Chat summary count
- Directory size
- Last activity timestamp
## 🛠 Configuration

### Notes Directory

Default: `~/Documents/ChatSummaries` (in the user's Documents folder)

#### Method 1: Environment Variable (Recommended)

Set the `CHAT_NOTES_DIR` environment variable:

```bash
# Windows
set CHAT_NOTES_DIR=C:\Users\YourName\Notes\ChatSummaries

# macOS/Linux
export CHAT_NOTES_DIR="/home/username/Notes/ChatSummaries"
```
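
For this to take effect, `main.py` needs to read the variable and fall back to the default; a minimal sketch of that lookup, assuming the `CHAT_NOTES_DIR` name documented here:

```python
# Sketch: resolve the notes directory from CHAT_NOTES_DIR, falling back to the default.
import os
from pathlib import Path

NOTES_DIR = Path(
    os.environ.get("CHAT_NOTES_DIR", Path.home() / "Documents" / "ChatSummaries")
)
NOTES_DIR.mkdir(parents=True, exist_ok=True)
```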
#### Method 2: Edit main.py

Modify the `NOTES_DIR` variable in `main.py`:

```python
# Examples:
NOTES_DIR = Path("your/custom/path")             # Custom absolute path
NOTES_DIR = Path.home() / "Notes" / "Summaries"  # User's home directory
NOTES_DIR = Path.cwd() / "summaries"             # Relative to the project directory
```
### Summary Templates

Customize the summary styles by modifying the prompt generation in the `summarize_chat` function.
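
In practice that prompt generation is usually just a mapping from style name to instructions; a hypothetical sketch of what it could look like (names and wording are illustrative, not taken from `main.py`):

```python
# Sketch: style-to-instruction mapping used when building the summarization prompt.
STYLE_INSTRUCTIONS = {
    "brief": "Write a 2-3 sentence overview covering only the key points.",
    "detailed": (
        "Write a comprehensive summary with headings for main topics, "
        "key decisions, and insights."
    ),
    "bullet_points": "Summarize as an organized, action-oriented bullet list.",
}

def build_summary_prompt(chat_history: str, summary_style: str = "detailed") -> str:
    instructions = STYLE_INSTRUCTIONS.get(summary_style, STYLE_INSTRUCTIONS["detailed"])
    return f"{instructions}\n\nConversation:\n{chat_history}"
```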
## 🔍 Project Structure

```
ide-chat-summarizer-mcp/
├── main.py            # MCP server implementation
├── pyproject.toml     # Project configuration
├── README.md          # This documentation
├── mcp-config.json    # MCP server configuration
└── uv.lock            # Dependency lock file
```
## 🤝 Usage Tips

- Organize by topic: Use meaningful titles when summarizing
- Regular cleanup: Use `delete_summary` to remove outdated summaries
- Style selection: Choose the right summary style for your needs
- Batch processing: Use `list_summaries` to review and manage multiple summaries

Transform your conversations into organized, searchable knowledge with the IDE Chat Summarizer MCP Server!