DeepView MCP

DeepView MCP is a Model Context Protocol server that enables IDEs like Cursor and Windsurf to analyze large codebases using Gemini's extensive context window.


Features

  • Load an entire codebase from a single text file (e.g., created with tools like repomix)
  • Query the codebase using Gemini's large context window (see the sketch below)
  • Connect to IDEs that support the MCP protocol, like Cursor and Windsurf
  • Configurable Gemini model selection via command-line arguments
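
Conceptually, each query sends the packed codebase file and your question to Gemini in a single prompt. The sketch below illustrates that flow; it assumes the google-generativeai Python package and is not the actual DeepView MCP implementation:

import os
import google.generativeai as genai

# Authenticate with the same environment variable used in the MCP config below
genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-2.0-flash-lite")  # the server's default model

def deepview(question: str, codebase_file: str) -> str:
    # Read the packed codebase (e.g. a repomix output file) and send it
    # together with the question as one large prompt
    with open(codebase_file, encoding="utf-8") as f:
        codebase = f.read()
    prompt = f"Here is a codebase:\n\n{codebase}\n\nQuestion: {question}"
    return model.generate_content(prompt).text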

Prerequisites

  • Python with pip (DeepView MCP is distributed on PyPI)
  • A Google Gemini API key, passed to the server via the GEMINI_API_KEY environment variable

Installation

Using pip

pip install deepview-mcp

Usage

Starting the Server

Note: you don't need to start the server manually; these parameters are configured in your IDE's MCP setup (see below).

# Basic usage with default settings
deepview-mcp [path/to/codebase.txt]

# Specify a different Gemini model
deepview-mcp [path/to/codebase.txt] --model gemini-2.0-pro

# Change log level
deepview-mcp [path/to/codebase.txt] --log-level DEBUG

The codebase file parameter is optional. If not provided, you'll need to specify it when making queries.

Command-line Options

  • --model MODEL: Specify the Gemini model to use (default: gemini-2.0-flash-lite)
  • --log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}: Set the logging level (default: INFO)

Using with an IDE (Cursor/Windsurf/...)

  1. Open IDE settings
  2. Navigate to the MCP configuration
  3. Add a new MCP server with the following configuration:
    {
      "mcpServers": {
        "deepview": {
          "command": "/path/to/deepview-mcp",
          "args": [],
          "env": {
            "GEMINI_API_KEY": "your_gemini_api_key"
          }
        }
      }
    }
    
    

Setting a codebase file is optional. If you always work with the same codebase, you can set a default file in the configuration:

{
   "mcpServers": {
     "deepview": {
       "command": "/path/to/deepview-mcp",
       "args": ["/path/to/codebase.txt"],
       "env": {
         "GEMINI_API_KEY": "your_gemini_api_key"
       }
     }
   }
 }

Here's how to specify which Gemini model to use:

{
   "mcpServers": {
     "deepview": {
       "command": "/path/to/deepview-mcp",
       "args": ["--model", "gemini-2.5-pro-exp-03-25"],
       "env": {
         "GEMINI_API_KEY": "your_gemini_api_key"
       }
     }
   }
}
  4. Reload the MCP servers configuration

Available Tools

The server provides one tool:

  1. deepview: Ask a question about the codebase
    • Required parameter: question - The question to ask about the codebase
    • Optional parameter: codebase_file - Path to a codebase file to load before querying
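
Your IDE constructs the tool call for you, but for reference, a raw MCP tools/call request to this tool would look roughly like the following (the question and file path are illustrative):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "deepview",
    "arguments": {
      "question": "Where is the Gemini API key read from?",
      "codebase_file": "/path/to/codebase.txt"
    }
  }
}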

Preparing Your Codebase

DeepView MCP requires a single file containing your entire codebase. You can use repomix to prepare your codebase in an AI-friendly format.

Using repomix

  1. Basic Usage: Run repomix in your project directory to create a default output file:
# Make sure you're using Node.js 18.17.0 or higher
npx repomix

This will generate a repomix-output.xml file containing your codebase.

  2. Custom Configuration: Create a configuration file to customize which files get packaged and the output format:
npx repomix --init

This creates a repomix.config.json file that you can edit to:

  • Include/exclude specific files or directories
  • Change the output format (XML, JSON, TXT)
  • Set the output filename
  • Configure other packaging options

Example repomix Configuration

Here's an example repomix.config.json file:

{
  "include": [
    "**/*.py",
    "**/*.js",
    "**/*.ts",
    "**/*.jsx",
    "**/*.tsx"
  ],
  "exclude": [
    "node_modules/**",
    "venv/**",
    "**/__pycache__/**",
    "**/test/**"
  ],
  "output": {
    "format": "xml",
    "filename": "my-codebase.xml"
  }
}
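
With a configuration like this in place, a typical workflow is to pack the codebase and then point DeepView MCP at the generated file (the filename below comes from the example configuration above):

# pack the codebase according to repomix.config.json
npx repomix

# query it through DeepView MCP
deepview-mcp my-codebase.xml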

For more information on repomix, visit the repomix GitHub repository.

License

MIT

Author

Dmitry Degtyarev (ddegtyarev@gmail.com)
