
cognee-mcp
Memory manager for AI apps and Agents using various graph and vector stores and allowing ingestion from 30+ data sources
Tools
cognify
Transforms text into a knowledge graph
codify
Transforms a codebase into a knowledge graph
search
Searches the knowledge graph for information
prune
Prunes the knowledge graph
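To call these tools from an MCP client, you point the client at the server in its configuration. A hypothetical configuration sketch (the command, directory path, and env values are placeholders for your local setup, not the project's confirmed defaults):

```json
{
  "mcpServers": {
    "cognee": {
      "command": "uv",
      "args": ["--directory", "/path/to/cognee/cognee-mcp", "run", "cognee"],
      "env": {
        "LLM_API_KEY": "YOUR_OPENAI_API_KEY"
      }
    }
  }
}
```

Check the project repository for the exact run command for your installation method.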
README
<div align="center"> <a href="https://github.com/topoteretes/cognee"> <img src="https://raw.githubusercontent.com/topoteretes/cognee/refs/heads/dev/assets/cognee-logo-transparent.png" alt="Cognee Logo" height="60"> </a>
<br />
cognee - memory layer for AI apps and Agents
<p align="center"> <a href="https://www.youtube.com/watch?v=1bezuvLwJmw&t=2s">Demo</a> . <a href="https://cognee.ai">Learn more</a> · <a href="https://discord.gg/NQPKmU5CCg">Join Discord</a> </p>
AI Agent responses you can rely on.
Build dynamic Agent memory using scalable, modular ECL (Extract, Cognify, Load) pipelines.
More on use-cases.
<div style="text-align: center"> <img src="https://raw.githubusercontent.com/topoteretes/cognee/refs/heads/dev/assets/cognee_benefits.png" alt="Why cognee?" width="100%" /> </div>
</div>
Features
- Interconnect and retrieve your past conversations, documents, images and audio transcriptions
- Reduce hallucinations, developer effort, and cost
- Load data to graph and vector databases using only Pydantic
- Manipulate your data while ingesting from 30+ data sources
Get Started
Get started quickly with a Google Colab <a href="https://colab.research.google.com/drive/1g-Qnx6l_ecHZi0IOw23rg0qC4TYvEvWZ?usp=sharing">notebook</a> or the <a href="https://github.com/topoteretes/cognee-starter">starter repo</a>.
Contributing
Contributions are at the core of making this a true open source project, and any you make are greatly appreciated. See CONTRIBUTING.md for more information.
📦 Installation
You can install Cognee with pip, poetry, uv, or any other Python package manager.
With pip
pip install cognee
💻 Basic Usage
Setup
import os
os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"
You can also set the variables by creating a .env file based on our <a href="https://github.com/topoteretes/cognee/blob/main/.env.template">template</a>. To use different LLM providers, check out our <a href="https://docs.cognee.ai">documentation</a> for more info.
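If you go the .env route, the idea is simply that each KEY=VALUE line in the file becomes an environment variable. A minimal hand-rolled sketch of that behavior (`load_env` is a hypothetical helper for illustration, not part of cognee; real projects usually use python-dotenv):

```python
import os

def load_env(path: str = ".env") -> None:
    """Read KEY=VALUE lines from a .env-style file into os.environ."""
    if not os.path.exists(path):
        return
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blank lines and comments; never override existing variables
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())
```

This keeps secrets like `LLM_API_KEY` out of your source code.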
Simple example
This script will run the default pipeline:
import cognee
import asyncio

async def main():
    # Add text to cognee
    await cognee.add("Natural language processing (NLP) is an interdisciplinary subfield of computer science and information retrieval.")

    # Generate the knowledge graph
    await cognee.cognify()

    # Query the knowledge graph
    results = await cognee.search("Tell me about NLP")

    # Display the results
    for result in results:
        print(result)

if __name__ == '__main__':
    asyncio.run(main())
Example output:
Natural Language Processing (NLP) is a cross-disciplinary and interdisciplinary field that involves computer science and information retrieval. It focuses on the interaction between computers and human language, enabling machines to understand and process natural language.
Graph visualization: <a href="https://rawcdn.githack.com/topoteretes/cognee/refs/heads/add-visualization-readme/assets/graph_visualization.html"><img src="assets/graph_visualization.png" width="100%" alt="Graph Visualization"></a> Open in browser.
For more advanced usage, have a look at our <a href="https://docs.cognee.ai"> documentation</a>.
Understand our architecture
<div style="text-align: center"> <img src="assets/cognee_diagram.png" alt="cognee concept diagram" width="100%" /> </div>
Demos
- What is AI memory
- Simple GraphRAG demo
- cognee with Ollama
Code of Conduct
We are committed to making open source an enjoyable and respectful experience for our community. See <a href="https://github.com/topoteretes/cognee/blob/main/CODE_OF_CONDUCT.md"><code>CODE_OF_CONDUCT</code></a> for more information.
💫 Contributors
<a href="https://github.com/topoteretes/cognee/graphs/contributors"> <img alt="contributors" src="https://contrib.rocks/image?repo=topoteretes/cognee"/> </a>
Star History
Recommended Servers
playwright-mcp
A Model Context Protocol server that enables LLMs to interact with web pages through structured accessibility snapshots without requiring vision models or screenshots.
Magic Component Platform (MCP)
An AI-powered tool that generates modern UI components from natural language descriptions, integrating with popular IDEs to streamline UI development workflow.
Audiense Insights MCP Server
Enables interaction with Audiense Insights accounts via the Model Context Protocol, facilitating the extraction and analysis of marketing insights and audience data including demographics, behavior, and influencer engagement.

VeyraX MCP
Single MCP tool to connect all your favorite tools: Gmail, Calendar and 40 more.
graphlit-mcp-server
The Model Context Protocol (MCP) Server enables integration between MCP clients and the Graphlit service. Ingest anything from Slack to Gmail to podcast feeds, in addition to web crawling, into a Graphlit project - and then retrieve relevant contents from the MCP client.
Kagi MCP Server
An MCP server that integrates Kagi search capabilities with Claude AI, enabling Claude to perform real-time web searches when answering questions that require up-to-date information.

E2B
Using MCP to run code via e2b.
Neon Database
MCP server for interacting with the Neon Management API and databases
Exa Search
A Model Context Protocol (MCP) server that lets AI assistants like Claude use the Exa AI Search API for web searches, giving AI models real-time web information in a safe and controlled way.
Qdrant Server
This repository is an example of how to create an MCP server for Qdrant, a vector search engine.