one-mcp
Overview
one-mcp is a lightweight MCP (Model Context Protocol) server built using FastAPI that enables intelligent tool management and semantic search for APIs.
It allows you to upload, manage, and query API tools using natural language, powered by modern embedding models via sentence-transformers.
The server supports multiple transport modes (stdio, HTTP, or both) and provides both a REST API and MCP tool interface for maximum flexibility.
Features
- Semantic Search: Find relevant API tools based on descriptive queries using sentence-transformers embeddings.
- Upload Tools: Add new API tools via JSON body or file upload.
- Delete Tools: Remove specific tools by name (supports batch deletion).
- Tool Statistics: Get insights on stored tools including count, model, and storage path.
- Tool Management: Clear, inspect, or modify your tool store easily.
- FastAPI Backend: High-performance, async-ready backend server.
- MCP Compatibility: Dual interface, REST API and MCP tools, for seamless integration.
- Dual Transport: Support for stdio and HTTP transports simultaneously.
- Persistent Storage: Tools and embeddings saved to disk with automatic loading.
- Structured Logging: Comprehensive logging with rotating file handlers.
Project Structure
one-mcp/
├── server.py          # Main application entry point with server orchestration
├── mcp_server.py      # MCP server class with multi-transport support
├── api.py             # FastAPI routes and REST endpoints
├── mcp_tools.py       # MCP tool definitions and handlers
├── models.py          # Pydantic models for request/response validation
├── tools_store.py     # Persistent tool storage with embeddings
├── config.py          # Server configuration and argument parsing
├── logging_setup.py   # Centralized logging configuration
├── test_specs.json    # Sample tool dataset for testing
├── CURLS.md           # Example cURL commands for testing API endpoints
├── MCP_TOOLS.md       # MCP tools documentation
├── requirements.txt   # Project dependencies
├── Dockerfile         # Docker containerization (CPU-based dependencies)
└── README.md          # Project documentation (this file)
Installation
1. Clone the Repository
git clone https://github.com/freakynit/one-mcp.git
cd one-mcp
2. Set Up Virtual Environment
python -m venv venv
source venv/bin/activate # macOS/Linux
venv\Scripts\activate # Windows
3. Install Dependencies
pip install -r requirements.txt
Dependencies include:
fastapi>=0.104.0
uvicorn>=0.24.0
fastmcp>=0.2.0
python-multipart>=0.0.6
torch==2.4.1
torchvision==0.19.1
torchaudio==2.4.1
sentence-transformers>=2.2.0
scikit-learn>=1.3.0
numpy>=1.24.0
Running the Server
Note: The first time you run the server, it will download the all-MiniLM-L6-v2 model from sentence-transformers. This may take a few seconds depending on your internet connection.
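If you want to pre-download the model outside the server (for example, before building an image or on a machine without reliable connectivity at runtime), a minimal sketch using sentence-transformers looks like this; only the model name comes from the note above, the rest is illustrative:

# Illustrative: warm the all-MiniLM-L6-v2 cache so the first server start is fast.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # downloads on first use, then cached locally
vector = model.encode("weather forecast for a city")
print(f"Embedding dimension: {len(vector)}")  # 384 for all-MiniLM-L6-v2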
Start with Dual Transport (stdio + HTTP)
python server.py --transport stdio,http --port 8003
This enables both MCP stdio communication and HTTP REST API access.
HTTP-only Mode
python server.py --transport http --port 8003
Stdio-only Mode (for MCP clients)
python server.py --transport stdio
Using Uvicorn Directly
uvicorn server:app --host 0.0.0.0 --port 8003
Configuration Options
- --transport: Transport mode (stdio, http, or stdio,http) - default: stdio
- --port: HTTP port number - default: 8000
- --host: Host to bind to - default: 0.0.0.0
- --storage_path: Path to store tool embeddings - default: tool_embeddings.json
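The options above map naturally onto a small argparse setup. The sketch below is an illustration only; the flag names and defaults come from the list above, while the function name and structure are assumptions about how config.py might look:

# Illustrative sketch of CLI parsing for the options listed above;
# the real config.py may be structured differently.
import argparse

def parse_args():
    parser = argparse.ArgumentParser(description="one-mcp server")
    parser.add_argument("--transport", default="stdio",
                        help="Transport mode: stdio, http, or stdio,http")
    parser.add_argument("--port", type=int, default=8000, help="HTTP port number")
    parser.add_argument("--host", default="0.0.0.0", help="Host to bind to")
    parser.add_argument("--storage_path", default="tool_embeddings.json",
                        help="Path to store tool embeddings")
    return parser.parse_args()

if __name__ == "__main__":
    args = parse_args()
    transports = {t.strip() for t in args.transport.split(",")}  # e.g. {"stdio", "http"}
    print(transports, args.port, args.host, args.storage_path)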
With the example commands above (--port 8003), the server is reachable at:
http://localhost:8003 (when HTTP transport is enabled)
The server automatically:
- Creates a logs/ directory for application logs
- Loads existing tools from tool_embeddings.json on startup
- Saves tools to disk after any modification
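As a rough illustration of that load-on-startup / save-after-modification behavior, a tool store could be sketched as below. The file name comes from the list above; the field names and structure are assumptions, not the exact schema used by tools_store.py:

# Rough persistence sketch; field names are assumptions, see tools_store.py for the real code.
import json
import os

class ToolsStore:
    def __init__(self, storage_path="tool_embeddings.json"):
        self.storage_path = storage_path
        self.tools = []        # list of tool specs
        self.embeddings = []   # parallel list of embedding vectors (plain lists so they serialize to JSON)
        if os.path.exists(storage_path):
            with open(storage_path) as f:
                data = json.load(f)
            self.tools = data.get("tools", [])
            self.embeddings = data.get("embeddings", [])

    def _save(self):
        with open(self.storage_path, "w") as f:
            json.dump({"tools": self.tools, "embeddings": self.embeddings}, f)

    def add(self, tool, embedding):
        self.tools.append(tool)
        self.embeddings.append(embedding)
        self._save()  # persist after every modification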
Testing the API
The server provides two interfaces:
- REST API: Available at /api/* endpoints (see CURLS.md for examples)
- MCP Tools: Available via the MCP protocol (see MCP_TOOLS.md for documentation)
REST API Endpoints
All endpoints return structured JSON responses with appropriate status codes.
Check Server Status
curl http://localhost:8003/api/status
Upload Tools via JSON
curl -X POST http://localhost:8003/api/tools/upload-json \
-H "Content-Type: application/json" \
-d '{"tools": [{"type": "function", "name": "get_weather", "description": "Get the current weather for a specific city.", "parameters": {"type": "object", "properties": {"city": {"type": "string", "description": "The name of the city to get weather for."}}}}]}'
Upload Tools via File
curl -X POST http://localhost:8003/api/tools/upload-file \
-F "file=@test_tools.json;type=application/json"
Search for Similar Tools
curl -X POST http://localhost:8003/api/tools/search \
-H "Content-Type: application/json" \
-d '{"query": "weather forecast for a city", "k": 3}'
Get Statistics
curl http://localhost:8003/api/tools/stats
Delete Specific Tools
curl -X DELETE http://localhost:8003/api/tools/delete \
-H "Content-Type: application/json" \
-d '{"tool_names": ["get_weather", "get_news_headlines"]}'
Clear All Tools
curl -X DELETE http://localhost:8003/api/tools/clear
MCP Access
The MCP endpoint is mounted at /mcp for HTTP streaming mode:
curl http://localhost:8003/mcp
For full MCP tool documentation, see MCP_TOOLS.md.
For more comprehensive testing examples, see CURLS.md.
Example MCP Configuration
To integrate with an MCP client (like Claude Desktop):
{
"mcpServers": {
"one-mcp-server": {
"command": "python",
"args": [
"/absolute/path/to/server.py",
"--transport", "stdio",
"--storage_path", "tool_embeddings.json"
]
}
}
}
For dual transport mode (stdio for MCP + HTTP for REST API):
{
"mcpServers": {
"one-mcp-server": {
"command": "python",
"args": [
"/absolute/path/to/server.py",
"--transport", "stdio,http",
"--port", "8004",
"--storage_path", "tool_embeddings.json"
]
}
}
}
Architecture
Components
- server.py: Entry point that initializes the app and starts the MCP server
- mcp_server.py: Handles multi-transport server orchestration (stdio/HTTP/dual)
- api.py: FastAPI application factory and REST endpoint definitions
- mcp_tools.py: MCP tool decorators and function implementations
- tools_store.py: Singleton store for tool embeddings with search capability
- models.py: Pydantic models for type safety and validation
- config.py: Configuration management and CLI argument parsing
- logging_setup.py: Centralized logging with rotating file handlers
How It Works
- Tool Storage: Tools are stored with their embeddings using sentence-transformers
- Semantic Search: Query embeddings are compared against tool embeddings using cosine similarity (see the sketch after this list)
- Persistence: Tools are automatically saved to tool_embeddings.json
- Dual Interface: The same functionality is available via the REST API and MCP tools
- Multi-Transport: The server can run stdio (for MCP clients) and HTTP simultaneously
Dev
- Create zip:
zip -r one-mcp.zip . -x "*.git/*" -x ".env" -x ".DS_Store" -x ".dockerignore" -x ".gitignore"
Contributing
Contributions are welcome! To contribute:
- Fork the repository
- Create a new feature branch (git checkout -b feature/my-feature)
- Commit your changes (git commit -m "Add my feature")
- Push to your fork (git push origin feature/my-feature)
- Submit a Pull Request
Before submitting, ensure:
- Code passes linting and basic tests.
- You've updated documentation if needed.
License
This project is licensed under the MIT License; see the LICENSE file for details.
Support
If you encounter any issues or have feature requests:
- Open an issue on GitHub
- Or contact @freakynit directly.