
MCPy: High-Performance Minecraft Server Engine
MCPy is a next-generation, ultra-optimized Minecraft server engine powered by Python, Cython, and advanced scientific computing libraries. Our mission is to deliver exceptional performance and flexibility, making Minecraft server development both accessible and future-proof.
Note: MCPy is under active development and is not yet feature-complete. The codebase contains known errors and is unstable. We welcome your bug reports and contributions to help us reach our goals faster!
🚧 Project Status
- The project is incomplete and contains known issues.
- Major features are under active development; the codebase is unstable.
- We highly value contributions and bug reports from the community.
🚀 Features at a Glance
- Cython-Accelerated Core: Event-driven server engine approaching C-level performance.
- Scientific Computing Backbone: Integrates NumPy, SciPy, and Polars for high-efficiency operations.
- Zero-Overhead Networking: Asynchronous, non-blocking, protocol-optimized networking.
- Sophisticated Entity System: Efficient, extensible entity management with advanced AI support.
- Robust Persistence Layer: Powered by PostgreSQL and SQLAlchemy ORM for reliable data storage.
- Comprehensive Benchmarking: Built-in performance analytics and profiling tools.
- Extensible Plugin Framework: Easily add server modifications.
- Real-Time Monitoring: Prometheus & Grafana integration for live metrics.
📐 Architecture Overview
MCPy is modular, with five high-performance core components:
- server_core.pyx
  - Event-driven request handling
  - Adaptive, high-precision tick system (see the sketch after this list)
  - Dynamic worker thread pool management
  - Real-time performance profiling
- world_engine.pyx
  - Procedural terrain generation with multi-octave noise and advanced biomes
  - Multi-threaded chunk generation & memory-efficient terrain storage
- network_core.pyx
  - Zero-copy packet serialization and protocol-level compression
  - Robust connection pooling & DDoS mitigation
- entity_system.pyx
  - Spatial hash-based entity tracking and multi-threaded physics
  - Modular AI behavior trees
- persistence
  - SQLAlchemy ORM for PostgreSQL/SQLite
  - Efficient chunk serialization and transactional world state
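To make the adaptive tick system above concrete, here is a minimal pure-Python sketch of the idea. The names TARGET_TPS, tick_world, and run_tick_loop are illustrative, not MCPy's actual API; the real loop lives in Cython inside server_core.pyx:

import time

TARGET_TPS = 20                  # ticks per second the engine targets
TICK_SECONDS = 1.0 / TARGET_TPS  # 50 ms budget per tick

def tick_world(dt: float) -> None:
    """Placeholder for entity updates, chunk ticking, and network flushes."""

def run_tick_loop() -> None:
    previous = time.perf_counter()
    while True:
        start = time.perf_counter()
        tick_world(start - previous)
        previous = start
        # Sleep only for whatever remains of the tick budget, so a slow tick
        # is followed immediately by the next one instead of drifting.
        elapsed = time.perf_counter() - start
        if elapsed < TICK_SECONDS:
            time.sleep(TICK_SECONDS - elapsed)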
📊 Performance Goals
| Metric | Target Value |
|---|---|
| Scalability | 20 TPS with 100+ concurrent players |
| Memory Usage | <2 GB for 10,000 chunks |
| Latency | <50 ms per player action |
| Reliability | 100% test coverage for core modules |
| Throughput | 10,000+ entity updates per tick |
⚙️ Technical Highlights
Cython & Performance
- Static typing (cdef) and aggressive compiler directives (see the build sketch below)
- Direct NumPy buffer access and pointer arithmetic
- Multi-threaded parallelism via thread pools
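The kind of build configuration this implies might look like the following. This is a hedged sketch rather than the project's actual setup.py, though the directive names are standard Cython compiler options:

from setuptools import setup
from Cython.Build import cythonize

setup(
    ext_modules=cythonize(
        ["mcpy/*.pyx"],
        compiler_directives={
            "boundscheck": False,  # skip bounds checks on typed arrays
            "wraparound": False,   # disallow negative indexing
            "cdivision": True,     # C semantics for division, no zero checks
            "language_level": 3,
        },
        nthreads=4,                # compile the extension modules in parallel
    ),
)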
Entity System
- Hierarchical, component-based design
- O(1) spatial partitioning via custom memory pools
- Adaptive Level-of-Detail (LOD) entity management
World Generation
- Multi-octave Perlin/Simplex noise (see the sketch after this list)
- Voronoi-based biome transitions
- Erosion, cave, and structure algorithms
- 10x chunk compression for storage efficiency
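The octave summation behind the terrain noise can be sketched in plain NumPy as follows. The base_noise stand-in is not a real Perlin/Simplex sampler (the actual world_engine.pyx implementation is Cython-optimized gradient noise); it only keeps the example self-contained:

import numpy as np

def base_noise(x: np.ndarray, z: np.ndarray) -> np.ndarray:
    # Stand-in for a real Perlin/Simplex sampler, smooth enough to
    # demonstrate how octaves are combined.
    return np.sin(x * 12.9898 + z * 78.233) * 0.5

def fractal_noise(x, z, octaves=4, persistence=0.5, lacunarity=2.0):
    # Each octave doubles the frequency (lacunarity) and halves the
    # amplitude (persistence); the weighted sum is then normalized.
    total = np.zeros_like(x, dtype=np.float64)
    amplitude, frequency, max_amp = 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += base_noise(x * frequency, z * frequency) * amplitude
        max_amp += amplitude
        amplitude *= persistence
        frequency *= lacunarity
    return total / max_amp

# Example: rough heightmap for a 16x16 chunk
xs, zs = np.meshgrid(np.arange(16) / 32.0, np.arange(16) / 32.0)
heightmap = 64 + (fractal_noise(xs, zs) * 32).astype(np.int32)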
📦 Installation
Prerequisites
- Python 3.9+ (3.11+ recommended)
- Modern C++ compiler (VS 2019+ / GCC 9+)
- PostgreSQL 13+ (for production)
- Minimum 8 GB RAM (16 GB recommended)
Quick Setup
git clone https://github.com/magi8101/mcpy.git
cd mcpy
# Windows
setup.bat
# Linux/macOS
chmod +x setup.sh
./setup.sh
Manual Installation
git clone https://github.com/magi8101/mcpy.git
cd mcpy
python -m venv .venv
# Windows:
.venv\Scripts\activate
# Linux/macOS:
source .venv/bin/activate
pip install -r _requirements.txt
pip install -e ".[dev]"
pip install -e ".[ai]" # Optional: Enable AI features
python check_dependencies.py
python setup.py build_ext --inplace
🚀 Running the Server
# Using setup scripts
# Windows:
setup.bat run
# Linux/macOS:
./setup.sh run
# Directly from the command line
python -m mcpy.server
python -m mcpy.server --config custom_config.toml --world my_world
python -m mcpy.server --performance-mode --max-players 100
python -m mcpy.server --debug --log-level debug
Command Line Options
| Option | Description |
|---|---|
| --config PATH | Path to TOML config file |
| --world PATH | World directory |
| --port NUMBER | Network port (default: 25565) |
| --max-players NUMBER | Max players (default: 20) |
| --view-distance NUMBER | Chunk view distance (default: 10) |
| --performance-mode | Extra performance optimizations |
| --debug | Enable debug mode |
| --log-level LEVEL | Set log level (default: info) |
| --backup | Enable automatic backups |
🗄️ Database Configuration
SQLite (Default)
[database]
type = "sqlite"
path = "world/mcpy.db"
journal_mode = "WAL"
synchronous = "NORMAL"
PostgreSQL (Production)
[database]
type = "postgresql"
host = "localhost"
port = 5432
dbname = "mcpy"
user = "postgres"
password = "your_password"
pool_size = 10
max_overflow = 20
echo = false
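Internally the persistence layer builds an SQLAlchemy engine from this [database] table. A minimal sketch of that mapping, assuming a helper named engine_from_config (illustrative, not MCPy's confirmed API):

from sqlalchemy import create_engine

def engine_from_config(cfg: dict):
    # Build an SQLAlchemy engine from the [database] table of the TOML config.
    if cfg["type"] == "postgresql":
        url = (f"postgresql+psycopg2://{cfg['user']}:{cfg['password']}"
               f"@{cfg['host']}:{cfg['port']}/{cfg['dbname']}")
        return create_engine(
            url,
            pool_size=cfg.get("pool_size", 10),
            max_overflow=cfg.get("max_overflow", 20),
            echo=cfg.get("echo", False),
        )
    # SQLite default: a single database file inside the world directory
    return create_engine(f"sqlite:///{cfg['path']}")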
💾 Persistence Features
- Transactional World Saving
  with session.begin():
      for chunk in dirty_chunks:
          session.add(ChunkModel.from_chunk(chunk))
- Efficient Chunk Serialization
  np.savez_compressed(io_buffer,
                      blocks=chunk.blocks,
                      heightmap=chunk.heightmap,
                      biomes=chunk.biomes)
  chunk_data = io_buffer.getvalue()
- Player Data Management
  player_model = PlayerModel(
      uuid=player.uuid,
      username=player.username,
      position=json.dumps([player.x, player.y, player.z]),
      inventory=pickle.dumps(player.inventory, protocol=5),
      stats=json.dumps(player.stats),
  )
- Intelligent Auto-saving: Only modified chunks/entities are saved (see the sketch after this list)
- Automated Backups: Configurable intervals & retention
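A minimal sketch of the dirty-tracking idea behind auto-saving. ChunkManager and its attributes are hypothetical names; ChunkModel.from_chunk mirrors the transactional example above:

class ChunkManager:
    def __init__(self):
        self.loaded = {}    # (cx, cz) -> chunk currently in memory
        self.dirty = set()  # coordinates of chunks modified since the last save

    def set_block(self, cx, cz, x, y, z, block_id):
        self.loaded[(cx, cz)].blocks[x, y, z] = block_id
        self.dirty.add((cx, cz))  # mark only this chunk for the next auto-save

    def autosave(self, session):
        # Persist just the modified chunks, then reset the dirty set.
        with session.begin():
            for coord in self.dirty:
                session.merge(ChunkModel.from_chunk(self.loaded[coord]))
        self.dirty.clear()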
🧪 Development & Testing
pytest # Run full test suite
pytest tests/test_entity_system.py # Entity system tests
python -m benchmarks.benchmark # Benchmarks
python -m mcpy.profiling.profile_module world_engine # Profile module
pytest --cov=mcpy --cov-report=html # Test coverage report
Performance Tuning Examples
- Entity System
  entity_spatial_hash = {(int(e.x // 16), int(e.z // 16)): [] for e in entities}
  for entity in entities:
      entity_spatial_hash[(int(entity.x // 16), int(entity.z // 16))].append(entity)
- World Engine
  from concurrent.futures import ThreadPoolExecutor
  import os
  with ThreadPoolExecutor(max_workers=os.cpu_count()) as executor:
      futures = [executor.submit(generate_chunk, x, z) for x, z in chunk_coords]
      chunks = [f.result() for f in futures]
- Network Optimization
  from libc.stdlib cimport malloc
  from libc.string cimport memcpy
  cdef char* buffer = <char*>malloc(packet_size)
  memcpy(buffer, &packet_header, sizeof(packet_header))
  memcpy(buffer + sizeof(packet_header), packet_data, packet_data_size)
🔧 Advanced Features
Plugin System
Add custom commands and behaviors easily:
from mcpy.plugins import Plugin, event

class TeleportPlugin(Plugin):
    @event("player.command")
    def on_command(self, player, command, args):
        if command == "tp" and len(args) >= 1:
            target = self.server.get_player_by_name(args[0])
            if target:
                player.teleport(target.x, target.y, target.z)
                return True
        return False
Real-time Monitoring
Integrated Prometheus/Grafana support:
[monitoring]
enabled = true
prometheus_port = 9090
metrics = ["tps", "memory_usage", "players_online", "chunks_loaded"]
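A hedged sketch of how these metrics could be exported with the standard prometheus_client package; the metric and function names follow the config above and are illustrative, not a confirmed MCPy API:

from prometheus_client import Gauge, start_http_server

# Gauges matching the metric names listed in the [monitoring] config
TPS = Gauge("mcpy_tps", "Server ticks per second")
MEMORY_USAGE = Gauge("mcpy_memory_usage_bytes", "Resident memory in bytes")
PLAYERS_ONLINE = Gauge("mcpy_players_online", "Connected players")
CHUNKS_LOADED = Gauge("mcpy_chunks_loaded", "Chunks currently in memory")

def start_metrics(port: int = 9090) -> None:
    # Expose /metrics on the configured prometheus_port; Grafana then reads
    # the scraped values from Prometheus.
    start_http_server(port)

def record_tick(tps: float, players: int, chunks: int) -> None:
    TPS.set(tps)
    PLAYERS_ONLINE.set(players)
    CHUNKS_LOADED.set(chunks)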
AI Entity Behaviors
Flexible, behavior-tree-driven AI:
class ZombieAI(MobAI):
    def setup_behaviors(self):
        self.behaviors = BehaviorTree(
            Selector([
                Sequence([
                    CheckPlayerNearby(radius=16),
                    PathfindToPlayer(),
                    AttackPlayer()
                ]),
                Sequence([
                    Wait(random.randint(20, 100)),
                    MoveToRandomPosition(radius=10)
                ])
            ])
        )
🗺️ Roadmap
Short-Term
- [ ] Entity collision system
- [ ] Crafting & inventory management
- [ ] Basic combat mechanics
- [ ] World generation optimization
Medium-Term
- [ ] Multi-world support & portals
- [ ] Custom block behaviors
- [ ] Enhanced mob AI
- [ ] In-game scripting API
Long-Term
- [ ] Distributed server architecture
- [ ] Machine learning-driven mob AI
- [ ] Real-time ray-traced lighting
- [ ] Custom physics engine
🤝 Contributing
We welcome your contributions! Please see our Contributing Guide to get started:
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to your branch (git push origin feature/amazing-feature)
- Open a Pull Request
📄 License
This project is licensed under the MIT License. See the LICENSE file for details.