Model Context Protocol (MCP)
🚀 OpenClient: the CLI-based universal AI application connector! An open-source Model Context Protocol (MCP) implementation that supercharges LLMs by standardizing how context is provisioned. Quickly connect the server of your choice to our client to extend your AI's capabilities. Ideal for developers building next-generation AI applications!
MCP is an open protocol that standardizes how applications provide context to LLMs - think of it like USB-C for AI applications. It enables seamless connection between AI models and various data sources/tools.
🚀 Why MCP?
MCP helps build agents and complex workflows on top of LLMs by providing:
- Pre-built integrations for your LLM to plug into
- Flexibility to switch between LLM providers
- Secure data handling best practices
- Standardized interface for AI applications
🏗️ Core Components
```mermaid
flowchart LR
    A[MCP Host] --> B[MCP Client]
    B --> C[Terminal]
    B --> D[Filesystem]
    B --> E[Memory]
    C --> F[Local Data]
    D --> G[Local Files]
    E --> H[Remote APIs]
```
- MCP Hosts: Applications (like Claude Desktop, IDEs) that need AI context
- MCP Clients: Protocol handlers that manage server connections
- MCP Servers: Lightweight programs exposing specific capabilities:
- Terminal Server: Execute commands
- Filesystem Server: Access local files
- Memory Server: Persistent data storage
- Data Sources:
- Local: Files, databases on your machine
- Remote: Web APIs and cloud services
🔄 System Overview
```mermaid
flowchart LR
    User --> Client
    Client --> AI[AI Processing]
    Client --> Terminal[Terminal]
    Client --> Filesystem[Filesystem]
    Client --> Memory[Memory]
```
Core Components:
- AI Processing: Google Gemini + LangChain for natural language understanding
- Terminal Server: Executes system commands in isolated workspace
- Filesystem Server: Manages file operations
- Memory Server: Stores and retrieves persistent data
Key Features:
- Automatic server startup as needed
- Secure workspace isolation
- Flexible configuration
- Extensible architecture
📁 File Structure
```mermaid
flowchart TD
    A[mcp] --> B[clients]
    A --> C[servers]
    A --> D[workspace]
    B --> E[mcp-client]
    E --> F[main.py]
    E --> G[client.py]
    E --> H[config.json]
    E --> I[.env]
    C --> J[terminal]
    J --> K[server.py]
    D --> L[memory.json]
    D --> M[notes.txt]
```
Key Files:
- `clients/mcp-client/main.py`: Main client entry point
- `clients/mcp-client/langchain_mcp_client_wconfig.py`: AI integration
- `clients/mcp-client/theailanguage_config.json`: Server configurations
- `clients/mcp-client/.env`: Environment variables
- `servers/terminal_server/terminal_server.py`: Terminal server
- `workspace/memory.json`: Persistent memory storage
- `workspace/notes.txt`: System notes
File Type Breakdown:
- Python Files (60%):
  - Core application logic and business rules
  - Server implementations and client applications
  - Includes both synchronous and asynchronous code
  - Follows PEP 8 style guidelines
- JSON Files (20%):
  - Configuration files for servers and services
  - API request/response schemas
  - Persistent data storage format
  - Strict schema validation enforced
- Text Files (15%):
  - System documentation (READMEs, guides)
  - Developer notes and annotations
  - Temporary data storage
  - Plaintext logs and outputs
- Other Formats (5%):
  - Environment files (.env)
  - Git ignore patterns
  - License information
  - Build configuration files
🔌 Client Components
```mermaid
flowchart TD
    A[User Input] --> B[Client]
    B --> C{Type?}
    C -->|Command| D[Terminal]
    C -->|File| E[Filesystem]
    C -->|Memory| F[Storage]
    C -->|AI| G[Gemini]
    D --> H[Response]
    E --> H
    F --> H
    G --> H
    H --> I[Output]
```
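The routing shown in the flowchart above can be sketched as a simple dispatcher. This is only an illustration: the real client lets the Gemini-powered agent decide which server handles a request, whereas this sketch (the `route` helper is a hypothetical name) uses plain keyword matching for clarity.

```python
# Minimal sketch of the request routing from the flowchart above.
# The actual client delegates this decision to the LLM; the
# keyword matching here is an illustrative assumption.

def route(request: str) -> str:
    """Classify a user request to one of the client's handlers."""
    text = request.lower()
    if any(word in text for word in ("run", "execute", "command")):
        return "terminal"
    if any(word in text for word in ("file", "read", "write")):
        return "filesystem"
    if any(word in text for word in ("remember", "recall", "memory")):
        return "memory"
    return "ai"  # fall back to the Gemini-powered handler

print(route("Run git status"))                      # terminal
print(route("Remember my favorite color is blue"))  # memory
```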
Main Client Files:
- `langchain_mcp_client_wconfig.py`: Main client application
- `theailanguage_config.json`: Server configurations
- `.env`: Environment variables
Key Features:
- Manages multiple MCP servers
- Integrates Google Gemini for natural language processing
- Handles dynamic response generation
- Processes LangChain objects
Configuration:
- `theailanguage_config.json`:
```json
{
  "mcpServers": {
    "terminal_server": {
      "command": "uv",
      "args": ["run", "../../servers/terminal_server/terminal_server.py"]
    },
    "memory": {
      "command": "npx.cmd",
      "args": ["@modelcontextprotocol/server-memory"],
      "env": {"MEMORY_FILE_PATH": "workspace/memory.json"}
    }
  }
}
```
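The config maps each server name to a launch command plus arguments. A minimal sketch of how a client might turn that structure into per-server argv lists before spawning processes; `build_launch_commands` is an illustrative helper, not the actual client API:

```python
import json

# Illustrative helper (not the real client API): convert the
# "mcpServers" config structure into per-server argv lists.

def build_launch_commands(config: dict) -> dict:
    commands = {}
    for name, spec in config.get("mcpServers", {}).items():
        # argv = launcher command followed by its arguments
        commands[name] = [spec["command"], *spec.get("args", [])]
    return commands

config = json.loads("""
{
  "mcpServers": {
    "terminal_server": {
      "command": "uv",
      "args": ["run", "../../servers/terminal_server/terminal_server.py"]
    }
  }
}
""")
print(build_launch_commands(config))
# {'terminal_server': ['uv', 'run', '../../servers/terminal_server/terminal_server.py']}
```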
- `.env` Setup:
```env
GOOGLE_API_KEY=your_api_key_here
THEAILANGUAGE_CONFIG=clients/mcp-client/theailanguage_config.json
```
Setup Steps:
- Create a `.env` file in `clients/mcp-client/`
- Add the required variables
- Restart the client after changes
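Loading those variables needs no extra dependencies; a minimal sketch of a `.env` parser (the real client may rely on python-dotenv instead, and `load_env` is a hypothetical helper name):

```python
import os
import tempfile

def load_env(path: str) -> None:
    """Read KEY=VALUE lines from a .env file into os.environ."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blanks and comments
            key, _, value = line.partition("=")
            # setdefault: real environment variables win over the file
            os.environ.setdefault(key.strip(), value.strip())

# Demo with a throwaway .env file
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("GOOGLE_API_KEY=your_api_key_here\n# a comment\n")
    env_path = f.name
load_env(env_path)
print(os.environ["GOOGLE_API_KEY"])  # your_api_key_here
```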
🖥️ Server Components
```mermaid
classDiagram
    class TerminalServer {
        +path: String
        +run()
        +validate()
        +execute()
    }
    TerminalServer --|> FastMCP
    class FastMCP {
        +decorate()
        +transport()
    }
```
Terminal Server
- Purpose: Executes system commands in isolated workspace
- Key Features:
- Fast command execution
- Secure workspace isolation
- Comprehensive logging
- Technical Details:
  - Uses `FastMCP` for transport
  - Validates commands before execution
  - Captures and returns output
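The validate-execute-capture loop can be sketched with the standard library alone. This is a hedged illustration of the idea, not the server's actual code: `run_command`, the blocklist, and the workspace handling are all assumptions.

```python
import subprocess
from pathlib import Path

def run_command(cmd: str, workspace: str = "workspace", timeout: int = 30) -> str:
    """Run a shell command inside the workspace and capture its output."""
    ws = Path(workspace)
    ws.mkdir(exist_ok=True)  # keep execution confined to one directory
    # Basic validation before execution (illustrative blocklist only;
    # a real server would need a much stricter policy)
    if any(tok in cmd for tok in ("rm -rf /", "shutdown")):
        return "error: command rejected by validation"
    result = subprocess.run(
        cmd, shell=True, cwd=ws,
        capture_output=True, text=True, timeout=timeout,
    )
    # Return captured stdout on success, stderr on failure
    return result.stdout if result.returncode == 0 else result.stderr

print(run_command("echo hello"))  # hello
```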
Workspace Files
memory.json
- Purpose: Persistent data storage
- Operations:
- Store/update/read data
- Query specific information
- Example Structure:
```json
{
  "user_preferences": {
    "favorite_color": "blue",
    "interests": ["science fiction"]
  },
  "system_state": {
    "last_commands": ["git status", "ls"]
  }
}
```
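Reading and updating such a store is plain JSON handling; a minimal sketch of the store/read operations (the `remember`/`recall` helper names are illustrative, not the memory server's API):

```python
import json
from pathlib import Path

def remember(path: str, section: str, key: str, value) -> None:
    """Store or update a value in the JSON memory file."""
    p = Path(path)
    data = json.loads(p.read_text()) if p.exists() else {}
    data.setdefault(section, {})[key] = value
    p.write_text(json.dumps(data, indent=2))

def recall(path: str, section: str, key: str):
    """Read a value back, or None if missing."""
    p = Path(path)
    if not p.exists():
        return None
    return json.loads(p.read_text()).get(section, {}).get(key)

remember("demo_memory.json", "user_preferences", "favorite_color", "blue")
print(recall("demo_memory.json", "user_preferences", "favorite_color"))  # blue
```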
notes.txt
- Purpose: System documentation and notes
- Content Types:
- User documentation (40%)
- System notes (30%)
- Temporary data (20%)
- Other (10%)
🛠️ Local Setup Guide
Prerequisites
- Python 3.9+
- Node.js 16+
- Google API Key
- UV Package Manager
Installation Steps
1. Clone the repository:
```bash
git clone https://github.com/Techiral/mcp.git
cd mcp
```
2. Set up the Python environment:
```bash
python -m venv venv
# Linux/Mac:
source venv/bin/activate
# Windows:
venv\Scripts\activate
pip install -r requirements.txt
```
3. Configure environment variables:
```bash
echo "GOOGLE_API_KEY=your_key_here" > clients/mcp-client/.env
echo "THEAILANGUAGE_CONFIG=clients/mcp-client/theailanguage_config.json" >> clients/mcp-client/.env
```
4. Install the Node.js servers:
```bash
npm install -g @modelcontextprotocol/server-memory @modelcontextprotocol/server-filesystem
```
Verification Checklist:
- [x] Repository cloned
- [x] Python virtual environment created and activated
- [x] Python dependencies installed
- [x] .env file configured
- [x] Node.js servers installed
💻 Usage Instructions
Basic Usage
- Start the client:
```bash
python clients/mcp-client/langchain_mcp_client_wconfig.py
```
- Type natural language requests and receive responses
Command Examples
File Operations:
- Create a file named example.txt
- Search for "function" in all Python files
- Count lines in main.py

Web Content:
- Summarize https://example.com
- Extract headlines from news site

System Commands:
- List files in current directory
- Check Python version
- Run git status

Memory Operations:
- Remember my favorite color is blue
- What preferences did I set?
- Show recent commands
Server Configuration
Key Configuration Files:
- `theailanguage_config.json`: Main server configurations
- `.env`: Environment variables
Example Server Configs:
```json
{
  "terminal_server": {
    "command": "uv",
    "args": ["run", "servers/terminal_server/terminal_server.py"]
  },
  "memory": {
    "command": "npx.cmd",
    "args": ["@modelcontextprotocol/server-memory"],
    "env": {"MEMORY_FILE_PATH": "workspace/memory.json"}
  }
}
```
Configuration Tips:
- Use absolute paths for reliability
- Set environment variables for sensitive data
- Restart servers after configuration changes
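The absolute-path tip can also be applied automatically when the config is loaded; a small illustrative helper (an assumption, not part of the client) that resolves any argument pointing at an existing relative path:

```python
from pathlib import Path

def absolutize_args(args: list, base: str = ".") -> list:
    """Resolve arguments that are existing relative paths to absolute paths."""
    out = []
    for arg in args:
        candidate = Path(base) / arg
        # Only rewrite arguments that actually exist on disk,
        # leaving flags and subcommands (e.g. "run") untouched
        out.append(str(candidate.resolve()) if candidate.exists() else arg)
    return out

print(absolutize_args(["run", "servers/terminal_server/terminal_server.py"]))
```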
🛠️ Troubleshooting
Common Issues & Solutions:
1. Authentication Problems:
   - Verify the Google API key in `.env`
   - Check that the key has the proper permissions
   - Regenerate the key if needed
2. File Operations Failing:
```bash
# Check permissions
ls -la workspace/
# Restart filesystem server
npx @modelcontextprotocol/inspector uvx mcp-server-filesystem
```
3. Memory Operations Failing:
```bash
# Verify memory.json exists
ls workspace/memory.json
# Restart memory server
npx @modelcontextprotocol/server-memory
```
Debugging Tools:
- Enable verbose logging:
```bash
echo "LOG_LEVEL=DEBUG" >> clients/mcp-client/.env
```
- List running servers:
```bash
npx @modelcontextprotocol/inspector list
```
🤝 How to Contribute
Getting Started:
- Fork and clone the repository
- Set up development environment (see Local Setup Guide)
Development Workflow:
```bash
# Create feature branch
git checkout -b feature/your-feature

# Make changes following:
# - Python: PEP 8 style
# - JavaScript: StandardJS style
# - Document all new functions

# Run tests
python -m pytest tests/

# Push changes
git push origin feature/your-feature
```
Pull Requests:
- Reference related issues
- Describe changes clearly
- Include test results
- Squash commits before merging
Code Review:
- Reviews typically within 48 hours
- Address all feedback before merging
Recommended Setup:
- VSCode with Python/JS extensions
- Docker for testing
- Pre-commit hooks
Recommended Servers
playwright-mcp
A Model Context Protocol server that enables LLMs to interact with web pages through structured accessibility snapshots without requiring vision models or screenshots.
Magic Component Platform (MCP)
An AI-powered tool that generates modern UI components from natural language descriptions, integrating with popular IDEs to streamline UI development workflow.
MCP Package Docs Server
Facilitates LLMs to efficiently access and fetch structured documentation for packages in Go, Python, and NPM, enhancing software development with multi-language support and performance optimization.
Claude Code MCP
An implementation of Claude Code as a Model Context Protocol server that enables using Claude's software engineering capabilities (code generation, editing, reviewing, and file operations) through the standardized MCP interface.
@kazuph/mcp-taskmanager
Model Context Protocol server for Task Management. This allows Claude Desktop (or any MCP client) to manage and execute tasks in a queue-based system.
Linear MCP Server
Enables interaction with Linear's API for managing issues, teams, and projects programmatically through the Model Context Protocol.
mermaid-mcp-server
A Model Context Protocol (MCP) server that converts Mermaid diagrams to PNG images.
Jira-Context-MCP
MCP server to provide Jira Tickets information to AI coding agents like Cursor
Sequential Thinking MCP Server
This server facilitates structured problem-solving by breaking down complex issues into sequential steps, supporting revisions, and enabling multiple solution paths through full MCP integration.