
# Basic MCP Application
A simple app that shows how the Model Context Protocol (MCP) works with FastAPI and Gradio (because I am not a dev who enjoys Streamlit headaches).
## Overview

This project demonstrates a basic MCP server with a Gradio frontend (Streamlit was a headache, and life's too short for unnecessary pain). Users can chat with AI models through what marketing people would call a "simple interface" and what developers know as "the best I could do before I move on 🥲".
## Technology Stack

- Backend: FastAPI + the MCP Python SDK (a match made in heaven, unlike pineapple on pizza)
- Frontend: Gradio (because pretty buttons make dopamine go brrr)
- AI Integration: Google Gemini API (not the horoscope sign, Sundar Pichai's "AI AI AI AI AI AI" thingy)
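To give a flavour of the backend side, here is a minimal sketch of an MCP tool server built with the MCP Python SDK's FastMCP helper. The server name, tool, and return value are made up for illustration; the real `backend/main.py` wires its tools into the FastAPI app.

```python
# Minimal MCP server sketch (illustrative, not the actual backend/main.py).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("basic-mcp-app")  # hypothetical server name

@mcp.tool()
def search_papers(query: str, limit: int = 5) -> str:
    """Search for scientific papers matching a query (stub)."""
    # The real tool would call the Semantic Scholar API here.
    return f"Top {limit} results for: {query}"

if __name__ == "__main__":
    mcp.run()
```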
## Known Issues
⚠️ Please Note
- The citation tool is not working properly at the moment. You may see errors when trying to analyze paper citations or when using some of the advanced search features. I am working on fixing this issue. When will it be fixed? Who knows ¯\_(ツ)_/¯. Maybe when we have AGI.
- The Semantic Scholar API has rate limits, which may cause the search functionality to return an error message such as "I cannot directly search for and provide you with papers." This is what happens when free APIs meet enthusiastic users: we love them to death. Just wait a bit and try again (or distract yourself with coffee while the rate limits reset).
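If the rate limiting bites often, one common mitigation is to retry the search with exponential backoff whenever the API answers HTTP 429. This is a hedged sketch of that idea, not the code in `backend/main.py`; the endpoint is the public Semantic Scholar Graph API and the function name is invented:

```python
# Retry a Semantic Scholar search with exponential backoff on HTTP 429.
import time
import httpx

def search_with_backoff(query: str, retries: int = 4) -> dict:
    url = "https://api.semanticscholar.org/graph/v1/paper/search"
    for attempt in range(retries):
        resp = httpx.get(url, params={"query": query, "limit": 5})
        if resp.status_code != 429:      # not rate limited
            resp.raise_for_status()
            return resp.json()
        time.sleep(2 ** attempt)         # wait 1s, 2s, 4s, 8s ...
    raise RuntimeError("Semantic Scholar is still rate limiting; try again later.")
```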
## Speed Up Your Setup

This project works great with uv, a super fast Python package installer! Instead of waiting for pip to finish sometime next century, you can use uv to install dependencies in seconds. I highly recommend this for a smoother experience (and to reclaim the hours of your life spent staring at progress bars).
## Quick Start

### What You'll Need

- Python 3.11 or newer (sorry, dinosaurs still using Python 2)
- pip package manager (or its cooler, faster cousin uv)
- The patience of an RCB fan (Google them) (optional but recommended)
### Setup Steps

1. Clone this project:

   ```bash
   git clone https://github.com/yourusername/basic-mcp-app.git
   cd basic-mcp-app
   ```

2. Create a virtual environment (because global dependencies are the path to emotional damage):

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

3. Install required packages.

   Using pip (the tortoise way):

   ```bash
   pip install -r requirements.txt
   ```

   Using uv (the hare way that actually wins the race):

   ```bash
   # Install uv first if you don't have it
   curl -LsSf https://astral.sh/uv/install.sh | sh
   # or
   pip install uv

   # Then install dependencies with uv
   uv pip install -r requirements.txt
   ```

   Using uv makes Python package installation go brrrr! It's much faster than regular pip (motherpromise 🤞). In the time it takes pip to realize what you've asked it to do, uv has already finished, made coffee, and started writing your next app for you.

4. Set up your API keys (the things you should never commit to GitHub but someone always does anyway):

   ```bash
   cp .env.example .env
   # Open .env and add your API keys
   ```

5. Run both servers with one command (like magic, but with more semicolons):

   ```bash
   python run.py
   ```

   This starts both the backend and frontend at once. It's like having your cake and eating it too, but with fewer calories. (A rough sketch of what such a script can look like follows these steps.)

   You can also start them separately if needed (for the control freaks among us):

   - Backend: `uvicorn backend.main:app --reload`
   - Frontend: `python frontend/app.py`

6. Open your web browser and go to http://localhost:8501 (if this doesn't work, try turning it off and on again 😑).
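For the curious, here is a minimal sketch of what a "start both servers" script like `run.py` can look like. The module path, commands, and lack of extra flags are assumptions for illustration, not a copy of the real file:

```python
# Rough sketch of run.py: launch the FastAPI backend and the Gradio frontend together.
import subprocess
import sys

def main() -> None:
    backend = subprocess.Popen(
        [sys.executable, "-m", "uvicorn", "backend.main:app", "--reload"]
    )
    frontend = subprocess.Popen([sys.executable, "frontend/app.py"])
    try:
        frontend.wait()      # keep going until the Gradio app exits
    finally:
        backend.terminate()  # shut the backend down when the UI stops
        backend.wait()

if __name__ == "__main__":
    main()
```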
## Project Files

Look at this beautiful directory structure that will never stay this clean once development really starts:

```
basic-mcp-app/
├── .env.example       # Template for your API keys (please don't make your API keys public 🙏)
├── .gitignore          # Files to ignore in git (like the emotional baggage)
├── README.md           # This help file that nobody reads until desperate
├── requirements.txt    # Required packages (aka dependency hell)
├── run.py              # Script to start both servers
├── backend/
│   └── main.py         # Backend server code with MCP (where the real magic happens)
└── frontend/
    └── app.py          # Gradio frontend interface (pretty buttons go here; see the sketch below)
```
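As a rough idea of what `frontend/app.py` does, here is a hedged sketch of a Gradio chat UI that forwards each message to the backend. The backend URL, endpoint path, and JSON shape are assumptions, not the repo's actual contract:

```python
# Sketch of a Gradio chat frontend that proxies messages to the FastAPI backend.
import gradio as gr
import httpx

BACKEND_URL = "http://localhost:8000/chat"  # hypothetical backend endpoint

def chat(message, history):
    resp = httpx.post(BACKEND_URL, json={"message": message}, timeout=60)
    resp.raise_for_status()
    return resp.json().get("reply", "(no reply)")

demo = gr.ChatInterface(fn=chat, title="Basic MCP App")

if __name__ == "__main__":
    demo.launch()
```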
## Features

- Scientific paper search using Semantic Scholar (for when Google Scholar is just too mainstream)
- Paper analysis tools (that work 60% of the time, every time)
- Simple chat interface (simple for users, nightmare for developers; the Gemini call behind it is sketched below)
- Easy setup process (if you've ever climbed Everest, this will feel like a walk in the park)
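The chat feature ultimately leans on the Gemini API. Here is a minimal sketch of that call, assuming the `google-generativeai` client; the environment variable and model name are my guesses, not the repo's:

```python
# Hedged sketch of a Gemini call; env var and model name are assumptions.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash")

def ask_gemini(prompt: str) -> str:
    """Send a prompt to Gemini and return the text reply."""
    response = model.generate_content(prompt)
    return response.text

if __name__ == "__main__":
    print(ask_gemini("Summarize the Model Context Protocol in one sentence."))
```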
## License

MIT (because I'm nice and don't want to read long licenses either)
## Thanks

- Anthropic for MCP: https://www.anthropic.com/news/model-context-protocol
- https://modelcontextprotocol.io/introduction
- Claude, for vibe-coding parts of the demo (not completely, just a tiny bit 🤏)