mcp-server-collector MCP server
An MCP Server used to collect MCP Servers over the internet.
Components
Resources
No resources yet.
Prompts
No prompts yet.
Tools
The server implements three tools (an example client call follows the list):
- extract-mcp-servers-from-url: Extracts MCP Servers from a given URL.
  - Takes "url" as a required string argument
- extract-mcp-servers-from-content: Extracts MCP Servers from the given content.
  - Takes "content" as a required string argument
- submit-mcp-server: Submits an MCP Server to an MCP servers directory such as mcp.so.
  - Takes "url" as a required string argument and "avatar_url" as an optional string argument
Configuration
A .env file with the following settings is required:
OPENAI_API_KEY="sk-xxx"
OPENAI_BASE_URL="https://api.openai.com/v1"
OPENAI_MODEL="gpt-4o-mini"
MCP_SERVER_SUBMIT_URL="https://mcp.so/api/submit-project"
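As a rough sketch of how these settings are typically consumed (the variable names match the .env above; the code itself is illustrative, not the server's actual implementation):

# Hypothetical sketch; not the server's actual code.
import os

import httpx
from dotenv import load_dotenv  # python-dotenv
from openai import OpenAI

load_dotenv()  # loads the .env file shown above

client = OpenAI(
    api_key=os.getenv("OPENAI_API_KEY"),
    base_url=os.getenv("OPENAI_BASE_URL", "https://api.openai.com/v1"),
)
model = os.getenv("OPENAI_MODEL", "gpt-4o-mini")
submit_url = os.getenv("MCP_SERVER_SUBMIT_URL", "https://mcp.so/api/submit-project")

def submit_mcp_server(url: str, avatar_url: str | None = None) -> int:
    # Illustrative helper: POST a discovered server to the submit endpoint.
    payload = {"url": url}
    if avatar_url:
        payload["avatar_url"] = avatar_url
    response = httpx.post(submit_url, json=payload, timeout=30)
    return response.status_code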
Quickstart
Install
Claude Desktop
On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
<details> <summary>Development/Unpublished Servers Configuration</summary>
"mcpServers": {
"fetch": {
"command": "uvx",
"args": ["mcp-server-fetch"]
},
"mcp-server-collector": {
"command": "uv",
"args": [
"--directory",
"path-to/mcp-server-collector",
"run",
"mcp-server-collector"
],
"env": {
"OPENAI_API_KEY": "sk-xxx",
"OPENAI_BASE_URL": "https://api.openai.com/v1",
"OPENAI_MODEL": "gpt-4o-mini",
"MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
}
}
}
</details>
<details> <summary>Published Servers Configuration</summary>
"mcpServers": {
"fetch": {
"command": "uvx",
"args": ["mcp-server-fetch"]
},
"mcp-server-collector": {
"command": "uvx",
"args": [
"mcp-server-collector"
],
"env": {
"OPENAI_API_KEY": "sk-xxx",
"OPENAI_BASE_URL": "https://api.openai.com/v1",
"OPENAI_MODEL": "gpt-4o-mini",
"MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
}
}
}
</details>
Development
Building and Publishing
To prepare the package for distribution:
- Sync dependencies and update lockfile:
uv sync
- Build package distributions:
uv build
This will create source and wheel distributions in the dist/ directory.
- Publish to PyPI:
uv publish
Note: You'll need to set PyPI credentials via environment variables or command flags:
- Token: --token or UV_PUBLISH_TOKEN
- Or username/password: --username / UV_PUBLISH_USERNAME and --password / UV_PUBLISH_PASSWORD
Debugging
Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.
You can launch the MCP Inspector via npm with this command:
npx @modelcontextprotocol/inspector uv --directory path-to/mcp-server-collector run mcp-server-collector
Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.