pypreader-mcp
An MCP server that reads the contents of Python packages, allowing Large Language Models (LLMs) and other AI agents to inspect the packages installed in a specified Python environment.
Overview
pypreader-mcp acts as a bridge between AI models and the local Python environment. By exposing a set of tools via MCP, it enables AI to programmatically browse installed packages, view their file structures, and read their source code. This is useful for tasks such as code analysis, dependency checking, and automated programming assistance.
Why build this?
When I use AI-integrated programming IDEs like Cursor or Trae, I often find that the model in use is unaware of the third-party libraries I need.
Sometimes it pretends to know them and confidently generates a pile of plausible-looking nonsense; other times it searches the internet, but the results are mostly poor and rarely contain anything useful.
So I created this MCP server. It can read a package's documentation from the official pypi.org website, or read its source code from the site-packages directory of the Python environment you specify, giving the model a more direct understanding of the third-party libraries you want to use.
Features
This Server provides the following tools to MCP clients:
- `get_pypi_description(package_name: str)`: Retrieve the official description of a package from PyPI.
- `get_package_directory(package_name: str)`: List the entire file and directory structure of a specified installed package.
- `get_source_code_by_path(package_path: str)`: Retrieve the complete source code of a specific file within a package.
- `get_source_code_by_symbol(package_path: str, symbol_name: str)`: Get the definition (code snippet) of a specified symbol (function, class, etc.).
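Your IDE's MCP client normally handles the JSON-RPC plumbing for you, but the sketch below shows one way to exercise these tools directly with the official MCP Python SDK (the `mcp` package). The client code, the example package name, and the environment values are illustrative assumptions, not part of this project.

```python
# Minimal sketch of calling the tools above with the official MCP Python SDK.
# Argument values are examples; adjust CURRENT_PYTHON_PATH for your setup.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uvx",
    args=["--from", "git+https://github.com/zakahan/pypreader-mcp.git", "pypreader-mcp"],
    # Keep the parent environment (PATH etc.) and add the server's variables.
    env={**os.environ, "CURRENT_PYTHON_PATH": "/path/to/your/venv/bin/python"},
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Fetch the PyPI description of a package the model does not know.
            result = await session.call_tool(
                "get_pypi_description", arguments={"package_name": "fastmcp"}
            )
            print(result)

asyncio.run(main())
```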
Usage
This tool is designed to act as an MCP Server for use in AI-based environments such as Cursor or Trae.
Configuration
In the MCP Server configuration of your AI environment, add a new Server with the following settings. This allows the AI to run the service directly from its Git repository using uvx.
```json
{
  "mcpServers": {
    "PypReader-MCP": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/zakahan/pypreader-mcp.git",
        "pypreader-mcp"
      ],
      "env": {
        "CURRENT_PYTHON_PATH": "<your-python-path>",
        "CURRENT_LOGGING_LEVEL": "INFO"
      }
    }
  }
}
```
MCP-Server Parameters
When configuring the MCP Server in your AI environment, you can specify the following env parameters:
- `CURRENT_PYTHON_PATH`: Specifies the path to the Python executable of the target package installation environment. If not provided, it defaults to the Python executable running the Server. You can find the correct path by activating your project's Python environment and running `which python` in the terminal.
- `CURRENT_LOGGING_LEVEL`: Sets the logging level of the Server. Options are `DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL`. The default value is `INFO`.
If your project uses a Python virtual environment, you typically need to set `CURRENT_PYTHON_PATH` to that environment's Python executable so the Server reads packages from the environment you intend.
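For reference, the snippet below is a minimal sketch of how a server can resolve the site-packages directories of the interpreter named by `CURRENT_PYTHON_PATH` by asking that interpreter directly. It is an assumption about the general approach, not code taken from pypreader-mcp.

```python
# Illustrative sketch: resolve site-packages from CURRENT_PYTHON_PATH by asking
# the target interpreter itself. Not the project's actual implementation.
import json
import os
import subprocess

def site_packages_for(python_path: str) -> list[str]:
    # Run the target interpreter and have it report its own site-packages dirs.
    code = "import json, site; print(json.dumps(site.getsitepackages()))"
    result = subprocess.run(
        [python_path, "-c", code],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

if __name__ == "__main__":
    # Fall back to whatever "python" is on PATH when the variable is not set.
    target = os.environ.get("CURRENT_PYTHON_PATH", "python")
    print(site_packages_for(target))
```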
AI Coding Example
Take Trae as an example. As of this writing (2025-07-02), the doubao-seed-1.6 model is unaware of the fastmcp package (in fact, most models don't recognize it). Under normal circumstances, it would either pretend to know it and output a pile of messy, incorrect content (even assuming I mean FastAPI), or perform a clumsy web search and turn up all sorts of disorganized information.
This time, I created a Trae agent equipped with this project's MCP server. The result is shown below: Trae was able to understand my project and then write a fastmcp service to complete my task.
(Screenshot: the entire process of Trae completing the requested task.)

(Screenshot: the specific details of the tool calls.)
License
This project is licensed under the MIT License. For details, please refer to the LICENSE file.
What's New Now?
- 2025-07-06: Rewrote `get_source_code_by_symbol` to use reading and writing temporary files instead of stdio, to support the Windows platform (a rough sketch of this pattern is shown below).
- 2025-07-04: Rewrote the `get_source_code_by_symbol` tool to fix an issue where it couldn't read classes or functions belonging to sub-packages.
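The temporary-file approach mentioned in the 2025-07-06 entry can be illustrated roughly as follows. This is a hedged sketch of the general pattern (the helper script, names, and arguments are made up for illustration), not the tool's actual implementation.

```python
# Illustrative sketch: run a helper script in the target interpreter and pass
# the result back through a temporary file rather than stdout, which is more
# robust on Windows where console encodings can mangle piped source code.
import os
import subprocess
import sys
import tempfile

HELPER = """\
import importlib, inspect, sys
module_name, symbol_name, out_path = sys.argv[1], sys.argv[2], sys.argv[3]
obj = getattr(importlib.import_module(module_name), symbol_name)
with open(out_path, "w", encoding="utf-8") as f:
    f.write(inspect.getsource(obj))
"""

def get_symbol_source(python_path: str, module_name: str, symbol_name: str) -> str:
    with tempfile.TemporaryDirectory() as tmp:
        helper_path = os.path.join(tmp, "helper.py")
        out_path = os.path.join(tmp, "source.txt")
        with open(helper_path, "w", encoding="utf-8") as f:
            f.write(HELPER)
        # The child process writes the symbol's source to out_path.
        subprocess.run(
            [python_path, helper_path, module_name, symbol_name, out_path],
            check=True,
        )
        with open(out_path, encoding="utf-8") as f:
            return f.read()

if __name__ == "__main__":
    print(get_symbol_source(sys.executable, "json", "dumps"))
```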
What's Next?
- Resolve the issue where the PyPI package name is inconsistent with the actual package path; e.g., the package named `google-adk` lives at `google/adk` and is imported as `google.adk` (a possible direction is sketched after this list).
- Design a suitable prompt for the Agent. (The Trae version is here.)
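One possible direction for the name-mismatch issue, sketched below under the assumption that the target environment runs Python 3.10+, is to use `importlib.metadata.packages_distributions()` to map a distribution name to the top-level modules it provides. This is an illustration, not a committed design or existing code in this project.

```python
# Illustrative sketch: map a PyPI distribution name (e.g. "google-adk") to the
# top-level import packages it installs. Requires Python 3.10+.
from importlib.metadata import packages_distributions

def import_names_for(dist_name: str) -> list[str]:
    # packages_distributions() maps top-level module names to the distributions
    # that provide them, so we invert that mapping for the given distribution.
    mapping = packages_distributions()
    return sorted(top for top, dists in mapping.items() if dist_name in dists)

if __name__ == "__main__":
    print(import_names_for("google-adk"))  # e.g. ["google"] when installed
```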