Petclinic MCP server
The Petclinic MCP server uses the Swagger Petstore v2 APIs (https://petstore.swagger.io/). It interacts with the Petstore API (treated here as a "PetClinic") and exposes tools that can be used by OpenAI models.
It exposes the following capabilities:
- fetch_petsByStatus: fetches pets by status. Available status values: available, pending, sold
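
For orientation, here is a minimal sketch of what such a tool might look like, assuming the server is built with FastMCP and httpx (the dependencies added below); the actual petclinic_mcp_server.py may differ in its details:

```python
# Minimal sketch only -- assumes FastMCP and httpx; details may differ
# from the actual petclinic_mcp_server.py.
import httpx
from mcp.server.fastmcp import FastMCP

PETSTORE_BASE = "https://petstore.swagger.io/v2"

mcp = FastMCP("petclinic")

@mcp.tool()
async def fetch_petsByStatus(status: str) -> str:
    """Fetch pets from the Swagger Petstore by status (available, pending, sold)."""
    async with httpx.AsyncClient() as client:
        resp = await client.get(
            f"{PETSTORE_BASE}/pet/findByStatus", params={"status": status}
        )
        resp.raise_for_status()
        return resp.text

if __name__ == "__main__":
    # stdio for local use; switch to 'sse' when deploying as a remote server
    mcp.run(transport="stdio")
```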

Prerequisites
- uv package manager
- Python
Running locally
- Tip: use the stdio transport to avoid setting up a remote server. Change line 39 of petclinic_mcp_server.py to use the stdio transport:
mcp.run(transport='stdio')
- Clone the project, navigate to the project directory, and initialize it with uv:
uv init
- Create a virtual environment and activate it:
uv venv
source .venv/bin/activate
- Install dependencies:
uv add mcp httpx
- Launch the MCP Inspector:
npx @modelcontextprotocol/inspector uv run petclinic_mcp_server.py
- Or launch the MCP server without the Inspector:
uv run petclinic_mcp_server.py
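
As a quick functional check over the stdio transport, you can also exercise the server with a minimal client. This is a sketch using the official mcp Python SDK and assumes the fetch_petsByStatus tool name above:

```python
# Sketch of a minimal stdio test client using the official mcp Python SDK.
# Assumes the server runs over the stdio transport and exposes fetch_petsByStatus.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    params = StdioServerParameters(command="uv", args=["run", "petclinic_mcp_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Tools:", [tool.name for tool in tools.tools])
            result = await session.call_tool("fetch_petsByStatus", {"status": "available"})
            print(result.content)

asyncio.run(main())
```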
Configuration for Claude Desktop
You will need to supply a server configuration to your MCP client. Here is what the configuration looks like in claude_desktop_config.json (replace the /{your-project-path}/ and /{your-uv-install-path}/ placeholders with absolute paths on your machine):
```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/{your-project-path}/petclinic-mcp/"
      ]
    },
    "research": {
      "command": "/{your-uv-install-path}/uv",
      "args": [
        "--directory",
        "/{your-project-path}/petclinic-mcp/",
        "run",
        "petclinic_mcp_server.py"
      ]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```
Deploy to Cloud Foundry
- Tip: use the SSE transport to deploy the Petclinic MCP server as a remote server. Change line 39 of petclinic_mcp_server.py to use the SSE transport:
mcp.run(transport='sse')
- Log in to your Cloud Foundry account and push the application:
cf push -f manifest.yml
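
The push reads its deployment settings from manifest.yml in the project. For reference, an illustrative manifest might look like the following; the application name, memory, buildpack, and start command are assumptions to adjust for your setup, and mcp and httpx must be installable during staging (e.g. via a requirements.txt):

```yaml
# Illustrative manifest.yml only -- name, memory, buildpack, and command are
# assumptions; adjust them to match the actual project.
applications:
  - name: petclinic-mcp-server
    memory: 256M
    buildpacks:
      - python_buildpack
    command: python petclinic_mcp_server.py
```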
Binding to MCP Agents
Model Context Protocol (MCP) servers are lightweight programs that expose specific capabilities to AI models through a standardized interface. These servers act as bridges between LLMs and external tools, data sources, or services, allowing your AI application to perform actions like searching databases, accessing files, or calling external APIs without complex custom integrations.
Create a user-provided service that provides the URL for an existing MCP server:
cf cups petclinic-mcp-server -p '{"mcpServiceURL":"https://your-petclinic-mcp-server.example.com"}'
Bind the MCP service to your application:
cf bind-service ai-tool-chat petclinic-mcp-server
Restart your application:
cf restart ai-tool-chat
Your chatbot will now register with the research MCP agent, and the LLM will be able to invoke the agent's capabilities when responding to chat requests.
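
For reference, a bound application can discover the server's URL through the VCAP_SERVICES environment variable, where a user-provided service exposes the JSON passed to cf cups as its credentials. A minimal Python sketch (the ai-tool-chat application may handle this differently):

```python
# Sketch: reading the MCP server URL from VCAP_SERVICES in a bound application.
# User-provided services appear under "user-provided"; the JSON passed to
# `cf cups -p` is exposed as "credentials".
import json
import os

vcap = json.loads(os.environ.get("VCAP_SERVICES", "{}"))
for service in vcap.get("user-provided", []):
    if service.get("name") == "petclinic-mcp-server":
        mcp_url = service["credentials"]["mcpServiceURL"]
        print("MCP server URL:", mcp_url)
```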