ghcontext: Supercharge Your LLMs with Real-time GitHub Context

License TypeScript Node.js MCP

"But my GitHub repo changed yesterday..." - Never worry about outdated information in your AI assistants again.

ghcontext (GitHub Context Provider) bridges the gap between GitHub and Large Language Models, giving AI assistants real-time access to repository information through the standardized Model Context Protocol (MCP).

<p align="center"> <img src="docs/images/ghcontext-diagram.png" alt="ghcontext Architecture" width="600"/> </p>

🔥 Why ghcontext?

  • Accurate, Real-time Information: LLMs often have outdated knowledge about repositories. ghcontext provides the latest API docs, README contents, and codebase structure, including for private repositories.
  • Deeper Understanding: Help LLMs grasp your project's architecture, design principles, and API usage patterns.
  • Seamless Integration: Compatible with any MCP-enabled model, including Claude, GPT, and others.
  • Highly Efficient: Intelligent caching reduces API calls while keeping information fresh.

✨ Key Features

  • API Documentation Extraction: Automatically identifies and extracts API documentation from READMEs and dedicated documentation files
  • Repository Structure Analysis: Provides a map of your codebase's organization
  • README Content Retrieval: Gets the latest documentation directly from GitHub
  • File Content Search: Find and extract specific files or code snippets
  • Repository Search: Discover repositories matching specific criteria

🚀 Quick Start

A note on tokens

ghcontext requires a GitHub token for authentication. You are responsible for managing your token securely, and you should grant it only the scopes your use case actually requires. For example, if you only need to read public repositories, a token with the public_repo scope is sufficient. ghcontext does not need write access to your repositories.
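
If you are unsure what a classic personal access token can do, you can inspect its granted scopes before handing it to ghcontext. The snippet below is a small sketch that assumes Node 18+ (for built-in fetch) and a GITHUB_TOKEN environment variable; note that fine-grained tokens do not report scopes through this header.

// check-scopes.ts - print the scopes attached to a classic GitHub token
const token = process.env.GITHUB_TOKEN;
if (!token) {
  throw new Error("Set GITHUB_TOKEN before running this check");
}

const res = await fetch("https://api.github.com/user", {
  headers: { Authorization: `token ${token}`, "User-Agent": "scope-check" },
});

// GitHub echoes the scopes of classic tokens in this response header
console.log("granted scopes:", res.headers.get("x-oauth-scopes"));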

Installation

Method 1: Run without Installation using npx

# Run directly without installation (GitHub token is REQUIRED)
npx ghcontext --GITHUB_TOKEN your_github_token

# OR using pnpm
pnpm dlx ghcontext --GITHUB_TOKEN your_github_token

This is the preferred way to hook ghcontext up to Claude (or any other MCP-enabled agent): it requires no installation and runs straight from the command line.

Method 2: Global Installation from npm

# Install globally using npm
npm install -g ghcontext

# OR using pnpm
pnpm add -g ghcontext

# Run ghcontext with your GitHub token (REQUIRED)
ghcontext --GITHUB_TOKEN your_github_token

Method 3: Manual Installation (Development)

# Clone the repository
git clone https://github.com/yourusername/ghcontext.git
cd ghcontext

# Install dependencies
pnpm install

# Start the server with GitHub token (REQUIRED)
pnpm start --GITHUB_TOKEN your_github_token

Usage with LLMs

Connect your MCP-compatible LLM to the ghcontext server endpoint:

http://localhost:3000/api/mcp

Your LLM will now have access to tools like:

  • get-repository-info: Get detailed information about a repository
  • get-repository-readme: Retrieve the current README content
  • get-repository-api-docs: Extract API documentation
  • search-repository-files: Find files in a repository
  • get-file-content: Retrieve specific file contents
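
For a programmatic client outside an LLM host, the official MCP TypeScript SDK can connect to the same endpoint and enumerate these tools. The sketch below assumes the server speaks the SSE transport at the URL above (newer ghcontext versions may require the streamable HTTP transport instead); @modelcontextprotocol/sdk is the standard SDK package, not part of ghcontext itself.

// list-tools.ts - discover the tools ghcontext exposes over MCP
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

const client = new Client(
  { name: "ghcontext-explorer", version: "0.1.0" },
  { capabilities: {} }
);

// Connect to the local ghcontext endpoint
await client.connect(new SSEClientTransport(new URL("http://localhost:3000/api/mcp")));

// Print every tool the server advertises
const { tools } = await client.listTools();
for (const tool of tools) {
  console.log(`${tool.name}: ${tool.description ?? ""}`);
}

await client.close();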

🔍 Example Scenario

Ask your MCP-enabled AI assistant:

"What are the available methods in the axios library for handling request interceptors?"

Instead of getting outdated or generic information, your assistant can:

  1. Use get-repository-api-docs to fetch the latest axios API documentation
  2. Analyze the current documentation for interceptor methods
  3. Provide you with accurate, up-to-date information
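
In MCP terms, step 1 of that flow is a single tool call. Here is a minimal sketch using the same SDK client as above; the argument names (owner, repo) are assumptions for illustration, so check the inputSchema reported by listTools for the real ones.

// fetch-api-docs.ts - ask ghcontext for the latest axios API documentation
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

const client = new Client({ name: "axios-docs-demo", version: "0.1.0" }, { capabilities: {} });
await client.connect(new SSEClientTransport(new URL("http://localhost:3000/api/mcp")));

// Argument names are illustrative; consult the tool's inputSchema from listTools()
const result = await client.callTool({
  name: "get-repository-api-docs",
  arguments: { owner: "axios", repo: "axios" },
});

console.log(JSON.stringify(result, null, 2));
await client.close();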

🧰 Architecture

ghcontext follows a modular design:

┌─────────────────┐       ┌──────────────┐       ┌────────────────┐
│   MCP Server    │◄─────►│  GitHub API  │◄─────►│  GitHub.com    │
│  (TypeScript)   │       │    Client    │       │                │
└────────┬────────┘       └──────────────┘       └────────────────┘
         │
         │
┌────────▼────────┐       ┌──────────────┐
│ Context         │       │   Caching    │
│ Processors      │◄─────►│   System     │
└─────────────────┘       └──────────────┘
  • MCP Server: Handles the Model Context Protocol communication
  • GitHub API Client: Manages GitHub REST and GraphQL API interactions
  • Context Processors: Extract and organize relevant information
  • Caching System: Improves performance and reduces API load
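
The caching layer is internal to ghcontext, but the idea is simple enough to sketch: responses from the GitHub API client are kept for a short time-to-live so repeated tool calls for the same repository do not burn rate limit. The code below is illustrative only and makes no claims about the real implementation.

// A minimal TTL cache of the kind the caching system describes (illustrative only)
class TtlCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  constructor(private ttlMs: number) {}

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // stale: drop it so the caller refetches upstream
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}

// Example: cache README lookups for five minutes
const readmeCache = new TtlCache<string>(5 * 60 * 1000);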

🧠 Why It Matters

Traditional AI assistants struggle with:

  • Outdated knowledge of repositories
  • Incomplete understanding of project structure
  • Inability to see recent changes and updates

ghcontext solves these problems by giving LLMs a direct line to GitHub's latest information, making your AI assistants more accurate, more helpful, and more in sync with your evolving codebase.

🛠️ Development

# Build the project
pnpm run build

# Run tests
pnpm test

# Lint your code
pnpm run lint

# Format your code
pnpm run format

📦 Publishing to npm

If you're a maintainer of this package and need to publish a new version:

  1. Update the version in package.json:

    # For patch releases (bug fixes)
    npm version patch
    
    # For minor releases (new features, no breaking changes)
    npm version minor
    
    # For major releases (breaking changes)
    npm version major
    
  2. Publish to npm:

    # The prepublishOnly script will run linting, tests, and build automatically
    npm publish
    
  3. Push tags to GitHub:

    git push --follow-tags
    

📝 License

This project is MIT licensed - see the LICENSE file for details.


<p align="center"> <i>ghcontext: Because your AI assistant should understand your code as well as you do.</i> </p>
