Code Knowledge Tool

Provides a project memory bank and RAG context provider for enhanced code understanding and management through vector embeddings, integrated with RooCode and Cline. Author: davidvc.

A knowledge management tool for code repositories using vector embeddings. This tool helps maintain and query knowledge about your codebase using advanced embedding techniques.
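At its core, a tool like this answers queries by nearest-neighbor search over embedding vectors. The sketch below illustrates that idea with hand-made 3-d vectors and cosine similarity; it is purely illustrative and not this package's API (the real embeddings come from an embedding model via Ollama and are stored in chromadb).

```python
import math

# Toy pre-computed embeddings. Real vectors come from an embedding
# model (e.g. via Ollama); these 3-d vectors are purely illustrative.
knowledge = {
    "parse_config(): loads project settings": [0.9, 0.1, 0.0],
    "run_server(): starts the MCP server":    [0.1, 0.9, 0.2],
    "embed_text(): calls the embedding model": [0.2, 0.3, 0.9],
}

def cosine(a, b):
    # Cosine similarity: dot product over the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def query(vec, top_k=1):
    # Rank stored snippets by similarity to the query vector.
    ranked = sorted(knowledge, key=lambda k: cosine(knowledge[k], vec),
                    reverse=True)
    return ranked[:top_k]

print(query([0.15, 0.85, 0.1]))  # nearest neighbor is the run_server() entry
```

The actual tool performs this search over a persistent chromadb collection rather than an in-memory dict, but the retrieval principle is the same.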

Building and Installing

1. Build the Package

First, you need to build the distribution files:

# Clone the repository
git clone https://github.com/yourusername/code-knowledge-tool.git
cd code-knowledge-tool

# Create and activate a virtual environment
python -m venv venv
source venv/bin/activate

# Install build tools
python -m pip install --upgrade pip build

# Build the package
python -m build

This will create two files in the dist/ directory:

  • code_knowledge_tool-0.1.0-py3-none-any.whl (wheel file for installation)
  • code_knowledge_tool-0.1.0.tar.gz (source distribution)

2. Install the Package

Prerequisites

  1. Ensure Ollama is installed and running:
# Install Ollama (if not already installed)
curl https://ollama.ai/install.sh | sh

# Start Ollama service
ollama serve
  2. Install the package:
Option 1: Install from wheel file (recommended for usage)
# Navigate to where you built the package
cd /path/to/code_knowledge_tool

# Install from the wheel file
pip install dist/code_knowledge_tool-0.1.0-py3-none-any.whl
Option 2: Install in editable mode (recommended for development)

This option is best if you want to modify the tool or contribute to its development:

# Assuming you're already in the code-knowledge-tool directory
# and have activated your virtual environment

# Install in editable mode with development dependencies
pip install -e ".[dev]"

Integration with RooCode/Cline

  1. Copy the MCP configuration to your settings:

For Cline (VSCode):

# Open the settings file
open ~/Library/Application\ Support/Code/User/globalStorage/rooveterinaryinc.roo-cline/settings/cline_mcp_settings.json

Add this configuration:

{
  "mcpServers": {
    "code_knowledge": {
      "command": "python",
      "args": ["-m", "code_knowledge_tool.mcp_tool"],
      "env": {
        "PYTHONPATH": "${workspaceFolder}"
      }
    }
  }
}

For RooCode:

# Open the settings file
open ~/Library/Application\ Support/RooCode/roocode_config.json

Add the same configuration as above.

  2. Restart RooCode/Cline to load the new tool.

Using as Memory Bank and RAG Context Provider

This tool can serve as your project's memory bank and RAG context provider. To set this up:

  1. Copy the provided template to your project:
cp clinerules_template.md /path/to/your/project/.clinerules
  2. Customize the rules and patterns in .clinerules for your project's needs

The template includes comprehensive instructions for:

  • Knowledge base management
  • RAG-based development workflows
  • Code quality guidelines
  • Memory management practices

See clinerules_template.md for the full configuration and usage details.
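Conceptually, a RAG workflow retrieves the most relevant stored knowledge and prepends it to the model's prompt. The sketch below illustrates that flow; build_context is a hypothetical helper using naive keyword overlap as a stand-in for the vector search the tool actually performs.

```python
def build_context(question, snippets, top_k=2):
    """Rank stored snippets by keyword overlap with the question and
    join the best matches into a context block (a crude stand-in for
    embedding-based retrieval)."""
    words = set(question.lower().split())
    ranked = sorted(snippets,
                    key=lambda s: len(words & set(s.lower().split())),
                    reverse=True)
    return "\n".join(ranked[:top_k])

snippets = [
    "The config loader reads settings from .clinerules",
    "Embeddings are generated locally via Ollama",
    "Tests live under tests/integration",
]

question = "how are embeddings generated"
context = build_context(question, snippets)

# The retrieved context is prepended to the question before it
# reaches the model, grounding its answer in project knowledge.
prompt = f"Context:\n{context}\n\nQuestion: {question}"
```

In the real workflow, retrieval quality comes from semantic similarity over embeddings, so relevant snippets are found even when they share no literal keywords with the question.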

Features

  • Local vector storage for code knowledge
  • Efficient embedding generation using Ollama
  • Support for multiple file types
  • Context-aware code understanding
  • Integration with RooCode and Cline via MCP
  • RAG-based context augmentation
  • Persistent knowledge storage

Requirements

  • Python 3.8 or higher
  • Ollama service running locally
  • chromadb for vector operations

Development

Running Tests

The project follows an integration-first testing approach, focusing on end-to-end functionality and MCP contract compliance. The test suite consists of:

  1. MCP Contract Tests

    • Tool registration and execution
    • Resource management
    • Knowledge operations
    • Error handling
  2. Package Build Tests

    • Installation verification
    • Dependency resolution
    • MCP server initialization
    • Basic functionality
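The contract tests above could be sketched along these lines. FakeKnowledgeServer and the test names here are hypothetical stand-ins to show the register/execute/error-handling shape, not the project's actual fixtures (those live in tests/integration/).

```python
class FakeKnowledgeServer:
    """Minimal stand-in for an MCP server: registers named tools and
    executes them, returning an error payload for unknown names."""

    def __init__(self):
        self.tools = {}

    def register(self, name, fn):
        self.tools[name] = fn

    def execute(self, name, **kwargs):
        if name not in self.tools:
            return {"error": f"unknown tool: {name}"}
        return {"result": self.tools[name](**kwargs)}

def test_tool_registration_and_execution():
    server = FakeKnowledgeServer()
    server.register("store_knowledge", lambda text: f"stored: {text}")
    out = server.execute("store_knowledge", text="note")
    assert out["result"] == "stored: note"

def test_unknown_tool_reports_error():
    server = FakeKnowledgeServer()
    assert "error" in server.execute("missing_tool")

test_tool_registration_and_execution()
test_unknown_tool_reports_error()
```

Under pytest these would be collected automatically; they are called directly here only so the sketch is self-contained.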

To run the tests:

# Install test dependencies
pip install -e ".[dev]"

# Run all tests
pytest

# Run specific test suites
pytest tests/integration/test_mcp_contract.py -v  # MCP functionality
pytest tests/integration/test_package_build.py -v  # Installation verification

Test Environment Requirements:

# Ensure Ollama is running
ollama serve

The tests use a temporary directory (test_knowledge_store) that is cleaned up automatically between test runs.

For more details on the testing strategy and patterns, see the documentation in docs/.

Future Distribution

If you want to make this package available through pip (i.e., pip install code-knowledge-tool), you would need to:

  1. Register an account on PyPI
  2. Install twine: pip install twine
  3. Upload your distribution: twine upload dist/*

However, for now, use the local build and installation methods described above.

License

MIT License
