
Compresto MCP
A Model Context Protocol (MCP) server for Compresto, providing AI assistants with real-time data about Compresto's usage statistics.
<a href="https://glama.ai/mcp/servers/@dqhieu/compresto-mcp"> <img width="380" height="200" src="https://glama.ai/mcp/servers/@dqhieu/compresto-mcp/badge" alt="Compresto MCP server" /> </a>
What is Compresto?
Compresto is a file compression app that helps users reduce file sizes. This MCP server allows AI assistants to access current statistics about Compresto's usage.
What is MCP?
The Model Context Protocol (MCP) is a standard that connects AI systems with external tools and data sources. This MCP server extends AI capabilities by providing access to Compresto's usage statistics.
Installation
git clone https://github.com/dqhieu/compresto-mcp
cd compresto-mcp
npm install
npm run build
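To try the server before wiring it into an assistant, you can launch it directly (it speaks MCP over stdio, so it will simply wait for a client to connect), or point the MCP Inspector, a separate optional tool, at the build output:
node build/index.js
npx @modelcontextprotocol/inspector node build/index.js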
Manual Configuration
Add the following to your MCP settings file:
{
  "mcpServers": {
    "compresto": {
      "command": "node",
      "args": [
        "/ABSOLUTE/PATH/TO/PARENT/FOLDER/compresto-mcp/build/index.js"
      ]
    }
  }
}
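The exact location of the settings file depends on your MCP client. For example, Claude Desktop typically reads its configuration from:
macOS:   ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json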
When integrated with compatible AI assistants, this MCP server provides real-time data about Compresto's usage.
Available Tools
The Compresto MCP server provides the following tools:
get-total-users
Returns the total number of Compresto users.
Example response: 12345
get-total-processed-files
Returns the total number of files processed by Compresto.
Example response: Processed 67890 files
get-total-size-reduced
Returns the total amount of file size reduced by Compresto.
Example response: Reduced 1234567890 bytes
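Each tool is presumably registered through the MCP TypeScript SDK in src/index.ts and returns its value as plain text. The sketch below shows roughly what a tool like get-total-users could look like; the stats endpoint URL and the totalUsers field are placeholders for illustration, not Compresto's real API:

// Minimal sketch of a single-tool MCP server (assumes Node 18+ for global fetch).
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

const server = new McpServer({ name: "compresto", version: "1.0.0" });

// Register a tool that fetches one statistic and returns it as text.
// The URL and response shape below are placeholders; the real endpoint
// is defined in src/index.ts.
server.tool("get-total-users", "Get total users of Compresto", async () => {
  const response = await fetch("https://example.com/api/compresto/stats");
  const stats = await response.json();
  return {
    content: [{ type: "text", text: String(stats.totalUsers) }],
  };
});

// Serve the tools over stdio so any MCP client can connect to this process.
const transport = new StdioServerTransport();
await server.connect(transport);

The other two tools would follow the same pattern, differing only in the statistic they read and the text they return.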
Development
Prerequisites
- Node.js (v16 or higher)
- npm or yarn
Project Structure
- src/index.ts - Main entry point containing the MCP server implementation
- package.json - Project dependencies and scripts
- tsconfig.json - TypeScript configuration
License
MIT License