MCP Agent - AI Expense Tracker
A practical demonstration of an AI agent implementation with a custom MCP server.
This project showcases how to build intelligent AI agents using the Model Context Protocol (MCP). Through a real-world expense tracking application, you'll see how AI agents can interact with tools, databases, and APIs to perform complex tasks through natural conversation.
🎯 What This Demonstrates
- Custom MCP Server: Build your own MCP server using FastAPI
- AI Agent Integration: Connect AI agents to tools via MCP protocol
- Real-world Application: Practical expense tracking use case
- Natural Language Interface: Chat with AI to manage your data
- Tool Discovery: AI automatically discovers and uses available tools
🏗️ Architecture
graph TB
subgraph "Backend"
API[API Server<br/>FastAPI + SQLite<br/>Port: 8002]
MCP[MCP Server<br/>FastAPI-MCP<br/>Port: 9002]
end
subgraph "AI Layer"
Agent[AI Agent<br/>Agno Framework<br/>Port: 7777]
LLM[LLM]
end
subgraph "Client Layer"
UI[Web UI<br/>Next.js + React<br/>Port: 3000]
Telegram[Telegram Bot<br/>Python Telegram Bot]
AnyClient[Any MCP Client]
end
subgraph "End Users"
User1[User]
User2[User]
User3[User]
end
API -->|Exposes REST API| MCP
MCP -->|MCP Protocol| Agent
MCP -.->|MCP Protocol| AnyClient
Agent -->|API Calls| LLM
Agent -->|Serves| UI
Agent -->|Serves| Telegram
UI -->|Interacts| User1
Telegram -->|Interacts| User2
AnyClient -.->|Interacts| User3
style API fill:#0066CC,color:#fff
style MCP fill:#00AA66,color:#fff
style Agent fill:#FF6600,color:#fff
style LLM fill:#8B5CF6,color:#fff
style UI fill:#06B6D4,color:#fff
style Telegram fill:#06B6D4,color:#fff
📖 For detailed architecture documentation, request flows, and deployment options, see ARCHITECTURE.md
🚀 Features
- AI-Powered Agent: Natural language expense tracking using OpenAI GPT-4
- SQLite Database: Persistent storage for all transactions
- Auto-Initialization: Automatic database setup with seed data
- MCP Integration: Extensible tool system for AI agents
- REST API: Full CRUD operations for expense management
- Multiple Clients: Web UI, Telegram bot, and direct agent interface
- Smart Categorization: Automatic expense categorization and insights
- Currency-Agnostic: Clean numerical formatting without currency symbols
📁 Project Structure
MCPAgent/
├── .env.example # Environment variables template
├── .env # Your configuration (create from .env.example)
├── agent/ # AI Agent with Agno framework
│ ├── agent.py # Main agent with system prompts
│ └── agno.db # Agent's SQLite database
├── server/ # FastAPI backend with MCP server
│ ├── main.py # API routes and endpoints
│ ├── store.py # SQLite data store
│ ├── models.py # Pydantic data models
│ ├── config.py # Configuration settings
│ ├── mcp_server.py # MCP protocol server
│ ├── start.py # Server initialization & startup
│ └── expenses.db # Transactions database
└── client/
├── agent-ui/ # Next.js web interface
└── telegram-bot/ # Telegram bot client
🏃 Quick Start
1. Install Dependencies
# Install Python dependencies
pip install -r server/requirements.txt
pip install agno openai python-dotenv
2. Set Environment Variables
# Copy example and add your API key
cp .env.example .env
# Edit .env and add your OPENAI_API_KEY
3. Initialize and Start Servers
cd server
# Check dependencies and initialize database with seed data
python start.py
# Start MCP server (in one terminal)
python start.py --mcp
# Start API server (in another terminal)
python start.py --api
4. Run the AI Agent
cd agent
python agent.py
Access the agent at: http://localhost:7777
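For orientation, below is a minimal sketch of what an Agno agent wired to this MCP server can look like. It assumes Agno's MCPTools client and the default MCP URL from the configuration; the actual agent/agent.py also sets system prompts, storage, and the web interface served on port 7777.
# Minimal sketch (assumes Agno's MCPTools client and the default MCP URL);
# the real agent/agent.py adds system prompts, storage, and the web UI.
import asyncio

from agno.agent import Agent
from agno.models.openai import OpenAIChat
from agno.tools.mcp import MCPTools

MCP_URL = "http://localhost:9002/mcp"

async def main():
    # Connect to the MCP server; the agent discovers the exposed expense tools
    async with MCPTools(transport="streamable-http", url=MCP_URL) as expense_tools:
        agent = Agent(
            model=OpenAIChat(id="gpt-4.1"),
            tools=[expense_tools],
            instructions="You are a personal expense tracking assistant.",
            markdown=True,
        )
        await agent.aprint_response("Add a 50 grocery expense", stream=True)

if __name__ == "__main__":
    asyncio.run(main())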
💬 Usage Examples
Chat with the AI agent:
- "Add a 50 grocery expense"
- "I spent 75 on dinner last night"
- "How much did I spend on food this month?"
- "Show me my financial summary"
- "What's my biggest expense category?"
- "Add income of 5000 from salary"
🛠️ Tech Stack
- Protocol: Model Context Protocol (MCP) - Custom server implementation
- Agent Framework: Agno
- AI Model: OpenAI GPT-4
- MCP Server: FastAPI-MCP (converts REST API to MCP tools)
- Backend: FastAPI + SQLite
- Frontend: Next.js + React
- Bot: Python Telegram Bot
🔌 How MCP Works Here
- FastAPI Backend (server/main.py) - Standard REST API with CRUD operations
- MCP Server (server/mcp_server.py) - Wraps the API and exposes it as MCP tools
- AI Agent (agent/agent.py) - Connects to the MCP server and automatically discovers its tools
- Natural Language - The user chats with the agent, and the agent calls the tools to complete tasks
User Input → AI Agent → MCP Server → FastAPI → SQLite
↓
Tool Selection & Execution
↓
Natural Language Response
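The wrapping step is small: fastapi-mcp takes an existing FastAPI app and republishes its routes as MCP tools. Below is a rough sketch of that pattern; the project's server/mcp_server.py may differ in details such as tool filtering and the serving setup.
# Rough sketch of the fastapi-mcp wrapping pattern; details in server/mcp_server.py may differ.
import uvicorn
from fastapi_mcp import FastApiMCP

from main import app as api_app  # the existing REST API from server/main.py

# Wrap the REST API so its endpoints become MCP tools
mcp = FastApiMCP(api_app, name="Expense Tracker MCP")
mcp.mount()  # exposes the MCP endpoint at /mcp on the wrapped app

if __name__ == "__main__":
    uvicorn.run(api_app, host="localhost", port=9002)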
📊 API Endpoints
- GET /transactions - List all transactions
- POST /transactions - Create new transaction
- PUT /transactions/{id} - Update transaction
- DELETE /transactions/{id} - Delete transaction
- GET /transactions/search?q= - Search transactions
- GET /summary - Financial summary
- GET /summary/categories - Category breakdown
- GET /health - Health check
Full API docs: http://localhost:8002/docs
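The same endpoints can also be exercised directly, which is handy for debugging without the agent. A quick sketch using requests follows; the field names in the payload are illustrative assumptions, so check http://localhost:8002/docs for the actual schema.
# Illustrative request bodies; see http://localhost:8002/docs for the real schema.
import requests

BASE = "http://localhost:8002"

# Create a transaction
created = requests.post(f"{BASE}/transactions", json={
    "amount": 50,
    "category": "groceries",
    "description": "Weekly shopping",
    "type": "expense",
}).json()
print(created)

# Search and summarize
print(requests.get(f"{BASE}/transactions/search", params={"q": "groceries"}).json())
print(requests.get(f"{BASE}/summary").json())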
🎯 Server Commands
The start.py script manages server initialization and startup:
# Check dependencies and initialize database
python start.py
# Start MCP server only
python start.py --mcp
# Start API server only
python start.py --api
# Custom ports
python start.py --api --port 8000
python start.py --mcp --port 9000
What start.py does:
- ✅ Checks all required dependencies
- ✅ Verifies environment variables
- ✅ Initializes SQLite database
- ✅ Seeds database with sample transactions (first run only)
- ✅ Starts requested server(s)
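For a sense of how the flags map to behavior, here is an approximate sketch of the CLI wiring. The real start.py also performs the dependency and environment checks listed above, and the module names used here are assumptions.
# Approximate CLI wiring only; module/app names are assumptions, not the real start.py.
import argparse
import uvicorn

parser = argparse.ArgumentParser(description="Initialize and start the expense tracker servers")
parser.add_argument("--api", action="store_true", help="start the REST API server")
parser.add_argument("--mcp", action="store_true", help="start the MCP server")
parser.add_argument("--port", type=int, help="override the default port")
args = parser.parse_args()

if args.api:
    uvicorn.run("main:app", host="localhost", port=args.port or 8002)
elif args.mcp:
    uvicorn.run("mcp_server:app", host="localhost", port=args.port or 9002)
else:
    print("Checking dependencies and initializing the database with seed data...")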
🤖 Agent Capabilities
The AI agent can:
- Create, read, update, and delete expenses
- Search transactions by keyword
- Generate financial summaries and insights
- Analyze spending patterns by category
- Provide budgeting recommendations
- Filter transactions by date, type, or category
🔧 Configuration
Environment Variables (.env)
OPENAI_API_KEY=your_key_here # Required for AI agent
HOST=localhost # Server host
PORT=8002 # API server port
MCP_HOST=localhost # MCP server host
MCP_PORT=9002 # MCP server port
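These values are read at startup via python-dotenv, which is already in the dependency list. A minimal sketch of how server/config.py can load them is shown below; the actual file may structure this differently.
# Minimal .env loading sketch; server/config.py may structure this differently.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the project root

HOST = os.getenv("HOST", "localhost")
PORT = int(os.getenv("PORT", "8002"))
MCP_HOST = os.getenv("MCP_HOST", "localhost")
MCP_PORT = int(os.getenv("MCP_PORT", "9002"))
OPENAI_API_KEY = os.environ["OPENAI_API_KEY"]  # required by the agent; fail fast if missing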
Server Configuration (server/config.py)
- Server host/port settings
- Database path
- MCP server configuration
Agent Configuration (agent/agent.py)
- AI model selection (default: gpt-4.1)
- System prompt customization
- Agent behavior settings
- Database location
🔍 Troubleshooting
Dependencies missing?
pip install -r server/requirements.txt
pip install agno openai python-dotenv
Database not initialized?
cd server && python start.py
Port already in use?
python start.py --api --port 8003
python start.py --mcp --port 9003
Agent can't connect to MCP?
- Ensure the MCP server is running: python start.py --mcp
- Check the MCP URL in agent/agent.py (default: http://localhost:9002/mcp)
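If the connection still fails, a quick connectivity check like the one below (using requests, which is not in the project's dependency list) can tell you whether each server is listening on its default port.
# Quick connectivity check; assumes default ports and the requests package.
import requests

try:
    print("API:", requests.get("http://localhost:8002/health", timeout=3).json())
except requests.RequestException as exc:
    print("API server not reachable:", exc)

try:
    # /mcp is not a plain REST route; any HTTP response just confirms the port is listening.
    resp = requests.get("http://localhost:9002/mcp", timeout=3)
    print("MCP server responded with status", resp.status_code)
except requests.RequestException as exc:
    print("MCP server not reachable:", exc)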
📊 Presentation
This project includes a presentation about practical AI agent implementation:
🌐 Live Demo
📁 Local Files
- English Slides: docs/en/index.html
- Russian Slides: docs/ru/index.html
Open the slides to learn more about AI agents and MCP protocol.
📚 Resources
This project is built with and inspired by amazing open-source projects:
- Model Context Protocol (MCP) - Standard protocol for connecting AI agents to tools
- FastAPI-MCP - FastAPI integration for MCP servers
- Agno - Modern framework for building AI agents
- Agent UI - Beautiful chat interface for AI agents
Special thanks to these projects and their maintainers for making AI agent development accessible and enjoyable! 🙏
📝 License
MIT
Built with ❤️ using AI agents and MCP