🚀 RL-MCP: Ryan's Model Context Protocol Server

Python FastAPI PostgreSQL Docker

🎯 A powerful, scalable Model Context Protocol (MCP) server built with modern Python technologies

🌟 What is RL-MCP?

RL-MCP is a robust Model Context Protocol server designed to provide AI models with structured access to external data and services. Think of it as a bridge 🌉 that allows AI assistants to interact with your applications, databases, and APIs in a standardized, secure way.

🎪 Current Features

  • 🔐 Secure Authentication - Built-in auth system to protect your endpoints
  • 📊 RESTful API - Clean, well-documented API endpoints with FastAPI
  • 🗄️ PostgreSQL Integration - Robust database layer with SQLModel/SQLAlchemy
  • 🐳 Docker Ready - Fully containerized development and deployment
  • 🔄 Database Migrations - Alembic-powered schema management
  • 📈 Health Monitoring - Built-in health checks and connection monitoring
  • 🎨 Interactive Docs - Auto-generated API documentation
  • 🛠️ Development Tools - Pre-commit hooks, linting, and formatting

📈 Stock Market Intelligence

🚀 Transform your applications with AI-powered financial intelligence

RL-MCP includes a comprehensive Stock Market Intelligence API that combines cutting-edge AI with real-time financial data:

🧠 AI-Powered Capabilities

  • 🔍 Vector Search: Semantic search across news, analysis, and market data using advanced NLP
  • 📊 Sentiment Analysis: Real-time sentiment scoring for news and market content
  • 🤖 Smart Analysis: AI-driven stock analysis with confidence scoring and recommendations
  • 🎯 Relevance Scoring: Intelligent content ranking and filtering

💹 Real-Time Market Data

  • 📈 Live Pricing: Current stock prices with change indicators and market metrics
  • 📰 News Intelligence: Latest financial news with sentiment analysis from multiple sources
  • 🌍 Market Overview: Comprehensive market summaries with top movers and trends
  • 🔥 Trending Analysis: Most active and discussed stocks based on data volume

⚡ High-Performance Architecture

  • 🚀 Intelligent Caching: Multi-layer caching for lightning-fast responses
  • 🔄 Background Processing: Async data ingestion and processing
  • 📊 Performance Monitoring: Built-in health checks and cache statistics
  • 🛡️ Enterprise-Ready: Secure, scalable, and production-ready

🎯 Use Cases

  • 🤖 AI Trading Assistants - Portfolio analysis and trading signals
  • 📊 Financial Research - Market research and competitive intelligence
  • 📱 Investment Apps - Smart notifications and educational content
  • 🏢 Enterprise Systems - Risk management and client reporting

📚 Comprehensive Documentation

Explore the detailed stock market API documentation in the docs/stock/ directory of the repository.

🚀 Future Vision

This MCP server is designed to be the foundation for AI-powered applications that need:

  • 🤖 AI Model Integration - Seamless connection between AI models and your data
  • 🔌 Plugin Architecture - Extensible system for adding new capabilities
  • 📡 Real-time Communication - WebSocket support for live data streaming
  • 🌐 Multi-tenant Support - Serve multiple clients with isolated data
  • 🔍 Advanced Search - Vector search and semantic querying capabilities
  • 📊 Analytics Dashboard - Monitor usage, performance, and insights

🛠️ Technology Stack

  • 🐍 Backend: Python 3.12 + FastAPI
  • 🗄️ Database: PostgreSQL with SQLModel
  • 🐳 Containerization: Docker + Docker Compose
  • 🔄 Migrations: Alembic
  • 🧪 Code Quality: Black, isort, pylint, pre-commit hooks
  • 📚 Documentation: Auto-generated OpenAPI/Swagger docs
  • 🧠 AI/ML: Sentence Transformers, Vector Search, Sentiment Analysis

🚀 Quick Start

Prerequisites

  • 🐳 Docker and Docker Compose
  • 🐍 Python 3.12+ (for local development)
  • 🍺 Homebrew (macOS) or equivalent package manager

🎯 One-Command Setup

Get up and running in seconds! Our setup script handles everything:

make setup-environment

This magical command will:

  • 🔧 Install all required dependencies
  • 🐍 Create and configure a Python virtual environment
  • 🐳 Set up Docker containers
  • 📦 Install all Python packages
  • ✅ Verify everything is working

🏃‍♂️ Running the Application

🐳 Docker Development (Recommended)

# Build and start all services
make up

# Or run in background
docker compose up -d

Your services will be available at:

  • 🌐 API Server: http://localhost:8000
  • 📚 API Docs: http://localhost:8000/docs
  • 📈 Stock API: http://localhost:8000/v1/stock
  • 🗄️ Database Admin: http://localhost:8080 (Adminer)
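Once the stack is up, you can confirm the containers are running and the API responds before diving into the endpoints below:

# List the running services
docker compose ps

# Quick smoke test against the API (the same health endpoint described later in this README)
curl http://localhost:8000/health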

🐍 Local Development

# Activate virtual environment
source venv/bin/activate

# Start the API and database on port 8000
make up

📖 API Documentation

Once running, explore the interactive API documentation:

  • 📊 Swagger UI: http://localhost:8000/docs
  • 📋 ReDoc: http://localhost:8000/redoc
  • 🔍 OpenAPI Spec: http://localhost:8000/openapi.json

🔑 Authentication

All API endpoints require authentication. Include your auth token in requests:

curl -H "Authorization: Bearer YOUR_TOKEN" http://localhost:8000/v1/item

📈 Stock API Quick Example

# Search for Tesla battery technology insights
curl -X POST "http://localhost:8000/v1/stock/search" \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "Tesla battery technology innovations",
    "symbols": ["TSLA"],
    "similarity_threshold": 0.8,
    "limit": 10
  }'

# Get current Apple stock price
curl -H "Authorization: Bearer YOUR_TOKEN" \
     "http://localhost:8000/v1/stock/price/AAPL"

# Get market summary
curl -H "Authorization: Bearer YOUR_TOKEN" \
     "http://localhost:8000/v1/stock/market/summary"

🗄️ Database Management

🔄 Creating Migrations

When you modify database models:

MSG="Add new awesome feature" make migration

🏗️ Database Commands

# Start the API and database on port 8000
make up

# Check database health
curl http://localhost:8000/health
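Beyond the Adminer UI on port 8080, you can inspect the database from the command line. This is a sketch only; the Compose service name (db) and database user (postgres) are assumptions, so adjust them to match your docker-compose configuration:

# Open a psql shell inside the database container (service and user names are assumptions)
docker compose exec db psql -U postgres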

🛠️ Development Workflow

📦 Managing Dependencies

# Regenerate requirements.txt with latest versions
make regen-requirements

🧹 Cleanup

# Remove all containers and volumes
make clean

🔍 Code Quality

Pre-commit hooks automatically run:

  • 🎨 Black - Code formatting
  • 📋 isort - Import sorting
  • 🔍 Pylint - Code linting
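If the hooks are not yet active in your clone, installing them is a one-time step, and you can run them against the whole codebase on demand:

# Install the git hooks defined in .pre-commit-config.yaml
pre-commit install

# Run every hook against all files
pre-commit run --all-files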

🏗️ Project Structure

rl-mcp/
├── 📁 app/                    # Main application code
│   ├── 📁 api/               # API layer
│   │   └── 📁 v1/           # API version 1
│   │       ├── 📁 base/     # Base models and tables
│   │       ├── 📁 item/     # Item management endpoints
│   │       └── 📁 stock/    # 📈 Stock market intelligence
│   │           ├── 📁 services/  # AI services (vector search, market data)
│   │           ├── 📄 routes_stock.py     # Stock API endpoints
│   │           ├── 📄 models_stock.py     # Data models
│   │           └── 📄 controllers_stock.py # Business logic
│   ├── 📁 databases/        # Database configuration
│   └── 📄 main.py          # Application entry point
├── 📁 docs/                 # 📚 Comprehensive documentation
│   └── 📁 stock/           # Stock API documentation
├── 📁 docker/               # Docker configurations
├── 📁 migrations/           # Database migrations
├── 📁 scripts/             # Utility scripts
├── 📁 utilities/           # Helper utilities
└── 📄 Makefile            # Development commands

🤝 Contributing

We welcome contributions! 🎉

  1. 🍴 Fork the repository
  2. 🌿 Create a feature branch
  3. ✨ Make your changes
  4. 🧪 Run tests and linting
  5. 📝 Submit a pull request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🆘 Support

Having issues? 🤔



🚀 Built with ❤️ for the future of AI-powered applications

Ready to revolutionize how AI models interact with your data? Let's build something amazing together!

📈 Featuring comprehensive stock market intelligence with AI-powered semantic search, real-time data, and intelligent caching 🤖💹


Recommended Servers

playwright-mcp
A Model Context Protocol server that enables LLMs to interact with web pages through structured accessibility snapshots without requiring vision models or screenshots.
Official · Featured · TypeScript

Magic Component Platform (MCP)
An AI-powered tool that generates modern UI components from natural language descriptions, integrating with popular IDEs to streamline UI development workflow.
Official · Featured · Local · TypeScript

Audiense Insights MCP Server
Enables interaction with Audiense Insights accounts via the Model Context Protocol, facilitating the extraction and analysis of marketing insights and audience data including demographics, behavior, and influencer engagement.
Official · Featured · Local · TypeScript

VeyraX MCP
Single MCP tool to connect all your favorite tools: Gmail, Calendar and 40 more.
Official · Featured · Local

graphlit-mcp-server
The Model Context Protocol (MCP) Server enables integration between MCP clients and the Graphlit service. Ingest anything from Slack to Gmail to podcast feeds, in addition to web crawling, into a Graphlit project, and then retrieve relevant contents from the MCP client.
Official · Featured · TypeScript

Kagi MCP Server
An MCP server that integrates Kagi search capabilities with Claude AI, enabling Claude to perform real-time web searches when answering questions that require up-to-date information.
Official · Featured · Python

E2B
Using MCP to run code via e2b.
Official · Featured

Neon Database
MCP server for interacting with the Neon Management API and databases.
Official · Featured

Exa Search
A Model Context Protocol (MCP) server that lets AI assistants like Claude use the Exa AI Search API for web searches, giving AI models real-time web information in a safe and controlled way.
Official · Featured

Qdrant Server
This repository is an example of how to create an MCP server for Qdrant, a vector search engine.
Official · Featured