
🧠 one-mcp

🚀 Overview

one-mcp is a lightweight MCP (Model Context Protocol) server built using FastAPI that enables intelligent tool management and semantic search for APIs. It allows you to upload, manage, and query API tools using natural language, powered by modern embedding models via sentence-transformers.

The server supports multiple transport modes (stdio, HTTP, or both) and provides both a REST API and MCP tool interface for maximum flexibility.


✨ Features

  • ๐Ÿ” Semantic Search: Find relevant API tools based on descriptive queries using sentence-transformers embeddings.
  • ๐Ÿ“ค Upload Tools: Add new API tools via JSON body or file upload.
  • ๐Ÿ—‘๏ธ Delete Tools: Remove specific tools by name (supports batch deletion).
  • ๐Ÿงพ Tool Statistics: Get insights on stored tools including count, model, and storage path.
  • ๐Ÿงน Tool Management: Clear, inspect, or modify your tool store easily.
  • โšก FastAPI Backend: High-performance, async-ready backend server.
  • ๐Ÿค MCP Compatibility: Dual interface - REST API and MCP tools for seamless integration.
  • ๐Ÿ”„ Dual Transport: Support for stdio and HTTP transports simultaneously.
  • ๐Ÿ’พ Persistent Storage: Tools and embeddings saved to disk with automatic loading.
  • ๐Ÿ“Š Structured Logging: Comprehensive logging with rotating file handlers.

🧩 Project Structure

one-mcp/
├── server.py           # Main application entry point with server orchestration
├── mcp_server.py       # MCP server class with multi-transport support
├── api.py              # FastAPI routes and REST endpoints
├── mcp_tools.py        # MCP tool definitions and handlers
├── models.py           # Pydantic models for request/response validation
├── tools_store.py      # Persistent tool storage with embeddings
├── config.py           # Server configuration and argument parsing
├── logging_setup.py    # Centralized logging configuration
├── test_specs.json     # Sample tool dataset for testing
├── CURLS.md            # Example cURL commands for testing API endpoints
├── MCP_TOOLS.md        # MCP tools documentation
├── requirements.txt    # Project dependencies
├── Dockerfile          # Docker containerization (CPU-based dependencies)
└── README.md           # Project documentation (this file)

โš™๏ธ Installation

1. Clone the Repository

git clone https://github.com/freakynit/one-mcp.git
cd one-mcp

2. Set Up Virtual Environment

python -m venv venv
source venv/bin/activate  # macOS/Linux
venv\Scripts\activate     # Windows

3. Install Dependencies

pip install -r requirements.txt

Dependencies include:

fastapi>=0.104.0
uvicorn>=0.24.0
fastmcp>=0.2.0
python-multipart>=0.0.6
torch==2.4.1
torchvision==0.19.1
torchaudio==2.4.1
sentence-transformers>=2.2.0
scikit-learn>=1.3.0
numpy>=1.24.0

🧠 Running the Server

Note: The first time you run the server, it will download the all-MiniLM-L6-v2 model from sentence-transformers. This may take a few seconds depending on your internet connection.

Start with Dual Transport (stdio + HTTP)

python server.py --transport stdio,http --port 8003

This enables both MCP stdio communication and HTTP REST API access.

HTTP-only Mode

python server.py --transport http --port 8003

Stdio-only Mode (for MCP clients)

python server.py --transport stdio

Using Uvicorn Directly

uvicorn server:app --host 0.0.0.0 --port 8003

Configuration Options

  • --transport: Transport mode (stdio, http, or stdio,http) - default: stdio
  • --port: HTTP port number - default: 8000
  • --host: Host to bind to - default: 0.0.0.0
  • --storage_path: Path to store tool embeddings - default: tool_embeddings.json
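
The options above map naturally onto argparse. The following is a hedged sketch of what the argument parsing in config.py might look like, not its actual contents; only the flag names and defaults are taken from the list above.

```python
# Minimal argparse sketch of the documented CLI flags. The real config.py
# may structure this differently; flag names and defaults come from the
# Configuration Options list above.
import argparse

def parse_args(argv=None):
    parser = argparse.ArgumentParser(description="one-mcp server options")
    parser.add_argument("--transport", default="stdio",
                        help="Transport mode: stdio, http, or stdio,http")
    parser.add_argument("--port", type=int, default=8000,
                        help="HTTP port number")
    parser.add_argument("--host", default="0.0.0.0",
                        help="Host to bind to")
    parser.add_argument("--storage_path", default="tool_embeddings.json",
                        help="Path to store tool embeddings")
    return parser.parse_args(argv)
```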

With HTTP transport enabled, the server from the examples above is available at: 👉 http://localhost:8003 (note that the default port is 8000 unless --port is given)

The server automatically:

  • Creates a logs/ directory for application logs
  • Loads existing tools from tool_embeddings.json on startup
  • Saves tools to disk after any modification
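
The load-on-startup/save-on-modification cycle can be pictured with a small sketch. The on-disk layout shown here is an illustrative assumption; the real format is defined in tools_store.py.

```python
# Illustration of the persistence behaviour: tools are written to a JSON
# file after each modification and reloaded on startup. The {"tools": ...}
# layout is an assumption; see tools_store.py for the actual format.
import json
from pathlib import Path

def save_tools(path, tools):
    """Persist the tool list (with embeddings) to disk."""
    Path(path).write_text(json.dumps({"tools": tools}, indent=2))

def load_tools(path):
    """Load tools on startup; an absent file means an empty store."""
    p = Path(path)
    if not p.exists():
        return []
    return json.loads(p.read_text())["tools"]
```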

🧪 Testing the API

The server provides two interfaces:

  1. REST API: Available at /api/* endpoints (see CURLS.md for examples)
  2. MCP Tools: Available via MCP protocol (see MCP_TOOLS.md for documentation)

REST API Endpoints

All endpoints return structured JSON responses with appropriate status codes.

Check Server Status

curl http://localhost:8003/api/status

Upload Tools via JSON

curl -X POST http://localhost:8003/api/tools/upload-json \
  -H "Content-Type: application/json" \
  -d '{"tools": [{"type": "function", "name": "get_weather", "description": "Get the current weather for a specific city.", "parameters": {"type": "object", "properties": {"city": {"type": "string", "description": "The name of the city to get weather for."}}}}]}'

Upload Tools via File

curl -X POST http://localhost:8003/api/tools/upload-file \
  -F "file=@test_specs.json;type=application/json"

Search for Similar Tools

curl -X POST http://localhost:8003/api/tools/search \
  -H "Content-Type: application/json" \
  -d '{"query": "weather forecast for a city", "k": 3}'

Get Statistics

curl http://localhost:8003/api/tools/stats

Delete Specific Tools

curl -X DELETE http://localhost:8003/api/tools/delete \
  -H "Content-Type: application/json" \
  -d '{"tool_names": ["get_weather", "get_news_headlines"]}'

Clear All Tools

curl -X DELETE http://localhost:8003/api/tools/clear

MCP Access

The MCP endpoint is mounted at /mcp for HTTP streaming mode:

curl http://localhost:8003/mcp

For full MCP tool documentation, see MCP_TOOLS.md.

For more comprehensive testing examples, see CURLS.md.
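
The cURL calls above can also be driven from Python. Below is a stdlib-only client sketch; the endpoint path and request-body shape come from the search example above, while the response schema is an assumption since it is not documented here.

```python
# Stdlib-only client sketch for the REST interface. The path and body
# shape mirror the cURL search example; the response schema is assumed.
import json
import urllib.request

BASE_URL = "http://localhost:8003"

def build_search_request(query, k=3):
    """JSON body for POST /api/tools/search (same shape as the cURL example)."""
    return {"query": query, "k": k}

def search_tools(query, k=3):
    """POST a natural-language query and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{BASE_URL}/api/tools/search",
        data=json.dumps(build_search_request(query, k)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))
```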


🧰 Example MCP Configuration

To integrate with an MCP client (like Claude Desktop):

{
  "mcpServers": {
    "one-mcp-server": {
      "command": "python",
      "args": [
        "/absolute/path/to/server.py",
        "--transport", "stdio",
        "--storage_path", "tool_embeddings.json"
      ]
    }
  }
}

For dual transport mode (stdio for MCP + HTTP for REST API):

{
  "mcpServers": {
    "one-mcp-server": {
      "command": "python",
      "args": [
        "/absolute/path/to/server.py",
        "--transport", "stdio,http",
        "--port", "8004",
        "--storage_path", "tool_embeddings.json"
      ]
    }
  }
}

๐Ÿ—๏ธ Architecture

Components

  • server.py: Entry point that initializes the app and starts the MCP server
  • mcp_server.py: Handles multi-transport server orchestration (stdio/HTTP/dual)
  • api.py: FastAPI application factory and REST endpoint definitions
  • mcp_tools.py: MCP tool decorators and function implementations
  • tools_store.py: Singleton store for tool embeddings with search capability
  • models.py: Pydantic models for type safety and validation
  • config.py: Configuration management and CLI argument parsing
  • logging_setup.py: Centralized logging with rotating file handlers

How It Works

  1. Tool Storage: Tools are stored with their embeddings using sentence-transformers
  2. Semantic Search: Query embeddings are compared using cosine similarity
  3. Persistence: Tools automatically saved to tool_embeddings.json
  4. Dual Interface: Same functionality available via REST API and MCP tools
  5. Multi-Transport: Server can run stdio (for MCP clients) and HTTP simultaneously
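
Steps 1 and 2 can be illustrated with a toy, dependency-free sketch. A character-frequency "embedder" stands in for the real sentence-transformers model (all-MiniLM-L6-v2) so the example is self-contained; everything below is illustrative, not the actual tools_store.py implementation.

```python
# Toy sketch of embed -> store -> rank-by-cosine-similarity.
# fake_embed is a deliberately crude stand-in for a sentence embedding.
import math

def fake_embed(text):
    """Stand-in for a sentence embedding: letter-frequency vector."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def search(store, query, k=3):
    """Rank stored tools by similarity of the query to their descriptions."""
    q = fake_embed(query)
    ranked = sorted(store, key=lambda name: cosine(store[name], q), reverse=True)
    return ranked[:k]

# Step 1: store each tool with an embedding of its description
store = {
    "get_weather": fake_embed("get the current weather for a city"),
    "get_news_headlines": fake_embed("fetch the latest news headlines"),
}
```

With a real embedding model swapped in for fake_embed, this is the essence of the semantic search flow.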

Dev

  1. Create zip: zip -r one-mcp.zip . -x "*.git/*" -x ".env" -x ".DS_Store" -x ".dockerignore" -x ".gitignore"

๐Ÿง‘โ€๐Ÿ’ป Contributing

Contributions are welcome! To contribute:

  1. Fork the repository
  2. Create a new feature branch (git checkout -b feature/my-feature)
  3. Commit your changes (git commit -m "Add my feature")
  4. Push to your fork (git push origin feature/my-feature)
  5. Submit a Pull Request

Before submitting, ensure:

  • Code passes linting and basic tests.
  • You've updated documentation if needed.

📜 License

This project is licensed under the MIT License; see the LICENSE file for details.


💬 Support

If you encounter any issues or have feature requests, please open an issue on the GitHub repository.
