MCP-APIKit
MCP-APIKit is a Model Context Protocol (MCP) server designed specifically for Windsurf IDE integration. It fetches API information from Eolink OpenAPI and provides it to the IDE's MCP client, enabling seamless API integration and management within your development environment.
Features
- Connects to Eolink OpenAPI to retrieve API specifications
- Exposes API information as MCP resources
- Provides tools for API discovery and exploration
- Supports API testing and integration within Windsurf IDE
- Implements the Model Context Protocol (MCP) for standardized communication
Installation
# Clone the repository
git clone https://github.com/yourusername/mcp-apikit.git
cd mcp-apikit
# Install dependencies
pnpm install
# Build the project
pnpm run build
Configuration
Create a .env file in the root directory with the following variables:
EOLINK_API_KEY=your_eolink_api_key
EOLINK_BASE_URL=https://api.eolink.com
SPACE_ID=your_space_id
PROJECT_ID=your_project_id
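These values are loaded at startup and used to authenticate requests to Eolink. A minimal sketch of how that might look, assuming dotenv for loading; the Eolink endpoint path and header name below are illustrative placeholders, not confirmed Eolink OpenAPI routes:
// config.ts -- illustrative sketch only; endpoint path and auth header are assumptions.
import "dotenv/config";

export const config = {
  apiKey: process.env.EOLINK_API_KEY ?? "",
  baseUrl: process.env.EOLINK_BASE_URL ?? "https://api.eolink.com",
  spaceId: process.env.SPACE_ID ?? "",
  projectId: process.env.PROJECT_ID ?? "",
  port: Number(process.env.PORT ?? 3000), // optional PORT override; see Usage below
};

// Hypothetical fetch of the project's API list from Eolink OpenAPI.
export async function fetchApiList(): Promise<unknown> {
  const url =
    `${config.baseUrl}/api-management/apis` + // placeholder path
    `?space_id=${config.spaceId}&project_id=${config.projectId}`;
  const res = await fetch(url, {
    headers: { "Eo-Secret-Key": config.apiKey }, // placeholder header name
  });
  if (!res.ok) throw new Error(`Eolink request failed: ${res.status}`);
  return res.json();
}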
Usage
Starting the Server
pnpm start
The server will start on the port specified in your .env file (default: 3000).
Debug
npx @modelcontextprotocol/inspector node dist/index.js
Connecting from Windsurf IDE
First build the project so that dist/index.js exists:
pnpm build
Then, in your Windsurf IDE settings, add a new MCP server with the following configuration:
"mcpServers": {
"apikit": {
"command": "node",
"args": [
"/Users/{userName}/Documents/mcp-apikit/dist/index.js"
],
"env": {}
}
}
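The command above launches the built entry point, which speaks MCP to Windsurf over stdio. As a rough sketch of what dist/index.js does, assuming the official TypeScript SDK (@modelcontextprotocol/sdk) rather than a hand-rolled protocol implementation:
// index.ts -- minimal MCP server bootstrap (sketch; SDK usage is an assumption).
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

const server = new McpServer({ name: "mcp-apikit", version: "1.0.0" });

// Resources and tools (see the sections below) are registered here
// before the transport is connected.

const transport = new StdioServerTransport();
await server.connect(transport); // Windsurf spawns this process and talks MCP over stdio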
API Resources
The MCP-APIKit server exposes the following resources:
- api://projects - List all API projects
- api://projects/{projectId} - Get details for a specific project
- api://projects/{projectId}/apis - List all APIs in a project
- api://projects/{projectId}/apis/{apiId} - Get details for a specific API
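For illustration, the api://projects and api://projects/{projectId} resources above could be registered roughly as follows, again assuming the TypeScript MCP SDK; listProjects and getProject are hypothetical helpers standing in for the Eolink-backed data layer:
import { McpServer, ResourceTemplate } from "@modelcontextprotocol/sdk/server/mcp.js";

// Hypothetical helpers backed by the Eolink OpenAPI client.
declare function listProjects(): Promise<unknown>;
declare function getProject(projectId: string): Promise<unknown>;

export function registerResources(server: McpServer): void {
  // Static resource: api://projects
  server.resource("projects", "api://projects", async (uri) => ({
    contents: [{ uri: uri.href, text: JSON.stringify(await listProjects(), null, 2) }],
  }));

  // Templated resource: api://projects/{projectId}
  server.resource(
    "project",
    new ResourceTemplate("api://projects/{projectId}", { list: undefined }),
    async (uri, { projectId }) => ({
      contents: [
        { uri: uri.href, text: JSON.stringify(await getProject(String(projectId)), null, 2) },
      ],
    })
  );
}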
Tools
The server provides the following tools:
- search-apis - Search for APIs across all projects
- test-api - Test an API endpoint with custom parameters
- import-api - Import an API specification from Eolink to your project
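A tool such as search-apis could be wired up in a similar way; the zod schema and the searchApis helper below are illustrative, not taken from the project:
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

// Hypothetical search helper backed by the Eolink OpenAPI client.
declare function searchApis(query: string): Promise<unknown[]>;

export function registerTools(server: McpServer): void {
  server.tool(
    "search-apis",
    { query: z.string().describe("Keyword to match against API names and paths") },
    async ({ query }) => ({
      content: [
        { type: "text" as const, text: JSON.stringify(await searchApis(query), null, 2) },
      ],
    })
  );
}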
Development
# Run in development mode with hot reloading
pnpm run dev
License
MIT