Codebase Insights MCP Server

An MCP (Model Context Protocol) server that analyzes API codebases to generate Postman collections, Product Owner reports, and detailed code insights. Supports multiple frameworks including FastAPI, Spring Boot, Flask, Express, and any codebase with an OpenAPI/Swagger specification.

Features

  • 🚀 Multi-Framework Support: FastAPI, Spring Boot, Flask, Express, Django REST
  • 📋 OpenAPI/Swagger First: Automatically detects and uses OpenAPI specs when available
  • 🔍 Smart Code Analysis: Falls back to AST/pattern matching when no spec is found
  • 🔐 Authentication Support: Handles Bearer, Basic, and API Key authentication
  • 📁 Organized Output: Groups endpoints by tags or path segments
  • 🎯 Accurate Detection: Extracts request bodies, parameters, and response examples
  • 🤖 OpenAI ChatGPT Integration: SSE transport support for ChatGPT Connectors and Deep Research
  • 🔌 Multiple Transport Modes: stdio (local), HTTP (testing), SSE (OpenAI integration)

Installation

For End Users (via uvx)

The easiest way to use this MCP server is with uvx:

# Install and run directly
uvx codebase-insights-mcp

# Or install globally
uv tool install codebase-insights-mcp

For Development

# Clone the repository
git clone https://github.com/yourusername/codebase-insights-mcp.git
cd codebase-insights-mcp

# Install dependencies with Poetry
poetry install

# Run in development
poetry run codebase-insights-mcp

Configuration

Claude Desktop Configuration

Add this to your Claude Desktop configuration file:

For uvx installation (recommended):

{
  "mcpServers": {
    "codebase-insights": {
      "command": "uvx",
      "args": ["--no-cache", "codebase-insights-mcp"],
      "env": {
        "BITBUCKET_EMAIL": "your-username",
        "BITBUCKET_API_TOKEN": "your-app-password",
        "output_directory": "/path/to/output"
      }
    }
  }
}

For local development:

{
  "mcpServers": {
    "codebase-insights": {
      "command": "poetry",
      "args": ["run", "codebase-insights-mcp"],
      "cwd": "/path/to/codebase-insights-mcp",
      "env": {
        "BITBUCKET_EMAIL": "your-username",
        "BITBUCKET_API_TOKEN": "your-app-password",
        "output_directory": "/path/to/output"
      }
    }
  }
}

Environment Variables

  • BITBUCKET_EMAIL: Your Bitbucket username (not email)
  • BITBUCKET_API_TOKEN: Bitbucket app password with repository read access
  • GITHUB_TOKEN: GitHub personal access token (for private repos)
  • output_directory: Where to save generated Postman collections (default: current directory)

Usage

Once configured, you can use the following commands in Claude:

Generate a Postman collection from https://github.com/username/repo.git

Create Postman collection for https://bitbucket.org/team/api-project.git

The server will:

  1. Clone the repository
  2. Detect the framework and analyze the codebase
  3. Extract all API endpoints with their parameters and examples
  4. Generate a Postman v2.1 collection file
  5. Save it to your output directory
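Step 4 above targets the Postman v2.1 collection format. As a rough sketch (the real server's Pydantic models carry far more detail, such as auth, headers, and examples), the envelope looks like this:

```python
# Minimal sketch of emitting a Postman v2.1 collection envelope.
# The endpoint dicts here are illustrative; the real models are richer.
import json

def build_collection(name: str, endpoints: list[dict]) -> dict:
    """Wrap extracted endpoints in a Postman v2.1 collection envelope."""
    return {
        "info": {
            "name": name,
            "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json",
        },
        "item": [
            {
                "name": ep["path"],
                "request": {
                    "method": ep["method"],
                    "url": {"raw": "{{baseUrl}}" + ep["path"]},
                },
            }
            for ep in endpoints
        ],
    }

collection = build_collection("demo-api", [{"method": "GET", "path": "/users"}])
print(json.dumps(collection, indent=2))
```

The `schema` URL is what tells Postman to interpret the file as a v2.1 collection; `{{baseUrl}}` is a Postman variable so the same collection works against any environment.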

Supported Frameworks

With Full Support

  • FastAPI (Python) - Full OpenAPI integration
  • Spring Boot (Java) - Annotation-based detection
  • Express (Node.js) - Route pattern matching
  • Flask (Python) - Decorator-based detection
  • Django REST (Python) - ViewSet and path detection
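To illustrate what "decorator-based detection" means for a framework like Flask, here is a simplified sketch using Python's `ast` module. The server's actual analyzers are more thorough (blueprints, method-specific decorators, etc.); this only handles plain `@app.route(...)`:

```python
# Illustrative sketch of decorator-based endpoint detection for Flask.
import ast

def extract_flask_routes(source: str) -> list[tuple[str, str]]:
    """Return (method, path) pairs from @app.route(...) decorators."""
    routes = []
    for node in ast.walk(ast.parse(source)):
        if not isinstance(node, ast.FunctionDef):
            continue
        for dec in node.decorator_list:
            if (isinstance(dec, ast.Call)
                    and isinstance(dec.func, ast.Attribute)
                    and dec.func.attr == "route"
                    and dec.args
                    and isinstance(dec.args[0], ast.Constant)):
                path = dec.args[0].value
                methods = ["GET"]  # Flask's default when methods= is omitted
                for kw in dec.keywords:
                    if kw.arg == "methods" and isinstance(kw.value, ast.List):
                        methods = [m.value for m in kw.value.elts]
                routes.extend((m, path) for m in methods)
    return routes

sample = '''
@app.route("/users", methods=["POST"])
def create_user():
    pass
'''
print(extract_flask_routes(sample))  # [('POST', '/users')]
```

Spring Boot annotation scanning and Express route-pattern matching follow the same idea, just with regex or language-specific parsers instead of Python's `ast`.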

Coming Soon

  • NestJS (TypeScript)
  • Ruby on Rails
  • ASP.NET Core

Publishing Updates to PyPI

One-Time Setup

# Configure PyPI token (get from https://pypi.org/manage/account/token/)
poetry config pypi-token.pypi YOUR-PYPI-TOKEN

Publishing Updates Workflow

  1. Update Version

    # Update version in pyproject.toml
    poetry version patch  # for bug fixes (0.1.0 -> 0.1.1)
    poetry version minor  # for new features (0.1.0 -> 0.2.0)
    poetry version major  # for breaking changes (0.1.0 -> 1.0.0)
    
  2. Build Package

    poetry build
    
  3. Publish to PyPI

    poetry publish
    
  4. Test Installation

    # Test the published package
    uvx --reinstall codebase-insights-mcp
    

Automated Version Workflow

# Complete update workflow
poetry version patch && poetry build && poetry publish

Users Update with uvx

After publishing, users can update with:

uvx --reinstall codebase-insights-mcp

Development

Running Tests

poetry run pytest

Code Quality

# Format code
poetry run black .

# Lint code
poetry run ruff check .

Local Testing with MCP Inspector

Option 1: Test Published Version (HTTP)

# Set environment variables
export BITBUCKET_EMAIL="your-username"
export BITBUCKET_API_TOKEN="your-app-password"
export output_directory="/tmp/postman-test"

# Run published version in HTTP mode
uvx codebase-insights-mcp@latest --transport http --port 8000

# Connect MCP Inspector to: http://localhost:8000/mcp

Option 2: Test Development Version (HTTP)

# From project directory
cd /path/to/codebase-insights-mcp

# Set environment variables
export BITBUCKET_EMAIL="your-username"
export BITBUCKET_API_TOKEN="your-app-password"
export output_directory="/tmp/postman-test"

# Run development version in HTTP mode
poetry run codebase-insights-mcp --transport http --port 8000

# Connect MCP Inspector to: http://localhost:8000/mcp

Testing SSE Transport (OpenAI ChatGPT Integration)

The SSE transport is required for OpenAI ChatGPT Connectors and Deep Research integration.

Local SSE Testing

Option 1: Test Published Version with SSE

# Set environment variables
export BITBUCKET_EMAIL="your-username"
export BITBUCKET_API_TOKEN="your-app-password"
export output_directory="/tmp/postman-test"

# Run published version in SSE mode
uvx codebase-insights-mcp@latest --transport sse --port 8000

# Server will be available at: http://localhost:8000/sse

Option 2: Test Development Version with SSE

# From project directory
cd /path/to/codebase-insights-mcp

# Set environment variables
export BITBUCKET_EMAIL="your-username"
export BITBUCKET_API_TOKEN="your-app-password"
export output_directory="/tmp/postman-test"

# Run development version in SSE mode
poetry run codebase-insights-mcp --transport sse --port 8001

# Server will be available at: http://localhost:8001/sse

Testing with OpenAI API

Once your SSE server is running and publicly accessible, test it with OpenAI:

curl https://api.openai.com/v1/responses \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-4",
    "tools": [
      {
        "type": "mcp",
        "server_label": "codebase-insights",
        "server_description": "MCP server for generating Postman collections from code repositories",
        "server_url": "https://your-server.com/sse",
        "require_approval": "never"
      }
    ],
    "input": "Search for FastAPI repositories"
  }'

Available Tools for OpenAI Integration

The following tools are available when using SSE transport with OpenAI:

  1. search - Search for repositories and code (required by OpenAI ChatGPT Connectors)

    {
      "query": "FastAPI authentication"
    }
    
  2. fetch - Fetch full document content by ID (required by OpenAI ChatGPT Connectors)

    {
      "id": "doc-123"
    }
    
  3. generate_collection - Generate Postman collections from repositories

    {
      "repo_url": "https://github.com/user/repo.git"
    }
    
  4. generate_product_owner_overview - Generate business analysis reports

    {
      "repo_url": "https://github.com/user/repo.git"
    }
    
  5. analyze_repository_for_llm - Clone and return code for LLM analysis

    {
      "repo_url": "https://github.com/user/repo.git",
      "max_files": 50
    }
    

Transport Modes Comparison

| Transport | Use Case | Endpoint | Tools Available |
|-----------|----------|----------|-----------------|
| stdio | Local CLI (Claude Desktop, Claude Code) | N/A (stdin/stdout) | All tools |
| HTTP | Development, MCP Inspector testing | http://localhost:8000/mcp | All tools |
| SSE | OpenAI ChatGPT integration, Deep Research | http://localhost:8000/sse | All tools (especially search and fetch for ChatGPT) |

When to use each transport:

  • stdio: Default for local MCP clients like Claude Desktop
  • HTTP: Best for testing with MCP Inspector during development
  • SSE: Required for OpenAI ChatGPT Connectors and Deep Research features

Using the Test Scripts

Interactive MCP Inspector Testing:

# Basic usage
./test_mcp_inspector.sh

# Test specific version
./test_mcp_inspector.sh --version 0.1.3

# Test latest version
./test_mcp_inspector.sh --latest

Automated HTTP API Testing:

# Test with default repository
./scripts/test_http_api.sh

# Test with custom repository
./scripts/test_http_api.sh "https://github.com/user/repo.git"

Testing with MCP Inspector

  1. Open MCP Inspector or run locally
  2. Enter URL: http://localhost:8000/mcp
  3. Click Connect
  4. Test tools:
    • generate_collection with: {"repo_url": "https://bitbucket.org/tymerepos/tb-payshap-svc.git"}
    • get_server_info with: {}

STDIO

To connect MCP Inspector over stdio instead of HTTP, use:

  • Command: /bin/bash
  • Arguments: -c "cd '/path/to/codebase-insights-mcp' && poetry run codebase-insights-mcp"
  • Environment:

export BITBUCKET_EMAIL="your-username"
export BITBUCKET_API_TOKEN="your-app-password"
export output_directory="/tmp/"

Architecture

The server follows clean architecture principles:

  • Models: Pydantic models for API endpoints and Postman collections
  • Analyzers: Framework-specific endpoint extraction logic
  • Generators: Postman collection generation from API models
  • Clients: Git repository access with authentication support
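How the layers compose can be sketched as below. The class names and interfaces here are assumptions for illustration, not the server's actual API; in the real code the models are Pydantic and the client clones the repository first:

```python
# Hypothetical sketch of the layered architecture; names are illustrative.
from dataclasses import dataclass

@dataclass
class Endpoint:                      # Models: Pydantic in the real server
    method: str
    path: str

class FlaskAnalyzer:                 # Analyzers: framework-specific extraction
    def extract(self, repo_path: str) -> list[Endpoint]:
        return [Endpoint("GET", "/health")]  # placeholder result

class PostmanGenerator:              # Generators: collection from API models
    def generate(self, endpoints: list[Endpoint]) -> dict:
        return {"item": [{"name": e.path, "request": {"method": e.method}}
                         for e in endpoints]}

def run_pipeline(repo_path: str) -> dict:
    # In the real server a Client layer clones the repo before analysis.
    endpoints = FlaskAnalyzer().extract(repo_path)
    return PostmanGenerator().generate(endpoints)

print(run_pipeline("/tmp/repo"))
```

Keeping analyzers and generators behind separate interfaces is what lets new frameworks be added without touching the Postman output logic.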

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

MIT License - see LICENSE file for details

Acknowledgments

  • Built with FastMCP framework
  • Inspired by API development workflows and the need for automation
