FluffOS MCP Server

Real driver validation for LPC development - An MCP server that wraps FluffOS CLI tools to provide actual driver-level validation and debugging.

This MCP server exposes FluffOS's powerful CLI utilities (symbol and lpcc) to AI assistants, enabling them to validate LPC code against the actual driver and examine compiled bytecode.

What This Enables

AI assistants can now:

  • Validate LPC files using the actual FluffOS driver (not just syntax checking)
  • Catch runtime compilation issues that static analysis misses
  • Examine compiled bytecode to debug performance or behavior issues
  • Understand how LPC code actually compiles

Tools

  • 🔍 fluffos_validate: Validate an LPC file using FluffOS's symbol tool
  • 🔬 fluffos_disassemble: Disassemble LPC to bytecode using lpcc
  • 📚 fluffos_doc_lookup: Search FluffOS documentation for efuns, applies, concepts, etc.

Prerequisites

1. FluffOS Installation

You need FluffOS installed with its CLI tools available. The following binaries should exist (a quick check follows the list):

  • symbol - For validating LPC files
  • lpcc - For disassembling to bytecode
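
Before going further, it is worth confirming that both binaries are present and executable. A minimal check, assuming FLUFFOS_BIN_DIR points at the same directory you will use in the configuration section below:

# Quick sanity check: both CLI tools should exist and be executable
# (FLUFFOS_BIN_DIR is the same path used in the Configuration section below)
ls -l "$FLUFFOS_BIN_DIR/symbol" "$FLUFFOS_BIN_DIR/lpcc"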

2. Node.js

Node.js 16+ required:

node --version  # Should be v16.0.0 or higher

3. Install Dependencies

cd /path/to/fluffos-mcp
npm install

Configuration

The server is configured through the following environment variables:

  • FLUFFOS_BIN_DIR - Directory containing FluffOS binaries (symbol, lpcc)
  • MUD_RUNTIME_CONFIG_FILE - Path to your FluffOS config file (e.g., /mud/lib/etc/config.test)
  • FLUFFOS_DOCS_DIR - (Optional) Directory containing FluffOS documentation for doc lookup
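
For a quick smoke test outside any AI client, you can set these variables and launch the server directly. This is a sketch that assumes the repository layout used in the setup examples below (index.js as the entry point); since the server communicates over stdio, it will simply wait for an MCP client to connect:

# Hypothetical paths - substitute your own FluffOS build and mudlib locations
export FLUFFOS_BIN_DIR=/path/to/fluffos/bin
export MUD_RUNTIME_CONFIG_FILE=/mud/lib/etc/config.test
export FLUFFOS_DOCS_DIR=/path/to/fluffos/docs   # optional, enables fluffos_doc_lookup

cd /path/to/fluffos-mcp
node index.js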

Setup for Different AI Tools

Warp (Terminal)

Add to your Warp MCP configuration:

Location: Settings → AI → Model Context Protocol

{
  "fluffos": {
    "command": "node",
    "args": ["/absolute/path/to/fluffos-mcp/index.js"],
    "env": {
      "FLUFFOS_BIN_DIR": "/path/to/fluffos/bin",
      "MUD_RUNTIME_CONFIG_FILE": "/mud/lib/etc/config.test",
      "FLUFFOS_DOCS_DIR": "/path/to/fluffos/docs"
    }
  }
}

Important: Use absolute paths!

Restart Warp after adding the configuration.

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or equivalent:

{
  "mcpServers": {
    "fluffos": {
      "command": "node",
      "args": ["/absolute/path/to/fluffos-mcp/index.js"],
      "env": {
        "FLUFFOS_BIN_DIR": "/path/to/fluffos/bin",
        "MUD_RUNTIME_CONFIG_FILE": "/mud/lib/etc/config.test"
      }
    }
  }
}

Restart Claude Desktop after configuration.
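
If the server does not show up after restarting, malformed JSON in the config file is a common culprit. An optional quick check (the macOS path above is assumed; adjust for your platform):

# Parse the config file; prints "config parses OK" only if the JSON is valid
node -e 'JSON.parse(require("fs").readFileSync(process.argv[1], "utf8")); console.log("config parses OK")' \
  "$HOME/Library/Application Support/Claude/claude_desktop_config.json"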

Usage Examples

Once configured, you can ask your AI assistant:

"Validate this LPC file with the actual driver" → AI uses fluffos_validate to run symbol

"Show me the bytecode for this function" → AI uses fluffos_disassemble to run lpcc

"Why is this code slow?" → AI examines the disassembly to identify inefficient patterns

"What's the syntax for call_out?" → AI uses fluffos_doc_lookup to search documentation

"How do I use mappings?" → AI searches docs for mapping-related documentation

How It Works

AI Assistant
    ↓ (natural language)
  MCP Protocol
    ↓ (tool calls: fluffos_validate, fluffos_disassemble)
  This Server
    ↓ (spawns: symbol, lpcc)
  FluffOS CLI Tools
    ↓ (validates/compiles with actual driver)
  Your LPC Code

  1. AI assistant sends MCP tool requests
  2. Server spawns the appropriate FluffOS CLI tool (roughly the command line sketched after this list)
  3. CLI tool validates/disassembles using the driver
  4. Server returns results to the AI
  5. AI understands your code at the driver level!
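
In other words, the server is a thin wrapper around commands you could also run by hand. The sketch below shows the rough shape of such an invocation; the exact arguments that symbol and lpcc accept depend on your FluffOS build, so treat the ordering here as an assumption and check each tool's own usage output. The .c path is a hypothetical mudlib file.

# Rough shape of a validation/disassembly run (argument order is an assumption -
# run each tool without arguments to see your build's actual usage)
"$FLUFFOS_BIN_DIR/symbol" "$MUD_RUNTIME_CONFIG_FILE" /mud/lib/obj/example.c
"$FLUFFOS_BIN_DIR/lpcc"   "$MUD_RUNTIME_CONFIG_FILE" /mud/lib/obj/example.c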

Complementary Tools

This server works great alongside:

  • lpc-mcp - Language server integration for code intelligence
  • VS Code with jlchmura's LPC extension - IDE support

Use them together for the complete LPC development experience!

Contributing

PRs welcome! This is a simple wrapper that can be extended with more FluffOS tools.

Credits

  • FluffOS Team - For the amazing driver and CLI tools
  • Model Context Protocol - Making this integration possible

License

Unlicense - Public Domain. Do whatever you want with this code.
