StartupFundingAgent - Production-Grade MCP

From Zero to Funding Pitch in 60 Seconds

StartupFundingAgent (also known as Grant Hunter MCP) is an enterprise-ready Model Context Protocol (MCP) server designed to autonomously hunt for non-dilutive funding, generate winning pitches using advanced AI frameworks, and seamlessly integrate with Google Workspace for execution.


🚀 Core Features

This MCP exposes three powerful, production-hardened endpoints:

1. /query_grants - Intelligent Grant Discovery

  • Real-Time Search: Direct integration with Grants.gov API.
  • Smart Filtering: Deduplicates and sorts opportunities by deadline.
  • Keyword-Based Discovery: Search funding opportunities with flexible keyword matching.
  • Resilient: Implements a 5x Retry Policy with Exponential Backoff to handle government API instability.

2. /generate_pitch - AI-Powered Pitch Architect

  • Gemini Integration: Leverages Google's Gemini 2.0 Flash for high-speed, high-quality generation.
  • 150-Word Precision: Generates compelling, concise funding pitches optimized for grant applications.
  • Triple-Horizon Framework: Enforces a strict prompt structure (Acute Pain Point, Technical Deviation, Macro-Economic Lock) to maximize scoring potential.
  • Graceful Fallback: Automatic template fallback ensures business continuity even if AI services are disrupted.
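
A minimal sketch of this generation-plus-fallback flow, using the google-generativeai client; the function name, prompt wording, and fallback template here are illustrative, not the actual pitch_generator.py internals:

import os
import google.generativeai as genai

# Illustrative fallback text, not the shipped template.
FALLBACK_TEMPLATE = (
    "{startup_name} addresses an acute pain point in {focus_area} and is "
    "applying to '{grant_title}' to accelerate development."
)

def generate_pitch(startup_name: str, focus_area: str, grant_title: str) -> dict:
    prompt = (
        f"Write a ~150-word funding pitch for {startup_name} ({focus_area}) "
        f"applying to '{grant_title}'. Structure it as: Acute Pain Point, "
        f"Technical Deviation, Macro-Economic Lock."
    )
    try:
        genai.configure(api_key=os.environ["GEMINI_API_KEY"])
        model = genai.GenerativeModel(os.environ.get("GEMINI_MODEL", "gemini-2.0-flash"))
        response = model.generate_content(prompt)
        return {"pitch_draft": response.text, "model_used": model.model_name, "status": "SUCCESS"}
    except Exception:
        # Graceful fallback: return the static template instead of failing the request.
        draft = FALLBACK_TEMPLATE.format(
            startup_name=startup_name, focus_area=focus_area, grant_title=grant_title
        )
        return {"pitch_draft": draft, "model_used": "template-fallback", "status": "FALLBACK"}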

3. /manage_google_services - Secure Execution

  • Gmail Integration: Auto-drafts personalized emails to grant officers.
  • Calendar Sync: Automatically adds hard deadlines to your Google Calendar.
  • Least Privilege: Operates with ephemeral OAuth tokens passed securely at runtime.
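
A rough sketch of the underlying Google API calls, assuming an OAuth access token supplied per request; the helper name create_grant_reminders and the email/event wording are illustrative:

import base64
from email.message import EmailMessage

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

def create_grant_reminders(grant_title: str, deadline_date: str, oauth_token: str) -> dict:
    # Ephemeral credentials: built from the token in the request body, never persisted.
    creds = Credentials(token=oauth_token)

    # Gmail: create a draft for outreach to the grant officer.
    msg = EmailMessage()
    msg["Subject"] = f"Inquiry: {grant_title}"
    msg.set_content(f"Hello, we intend to apply for '{grant_title}' (deadline {deadline_date}).")
    raw = base64.urlsafe_b64encode(msg.as_bytes()).decode()
    gmail = build("gmail", "v1", credentials=creds)
    draft = gmail.users().drafts().create(userId="me", body={"message": {"raw": raw}}).execute()

    # Calendar: add the hard deadline as an all-day event.
    calendar = build("calendar", "v3", credentials=creds)
    event = calendar.events().insert(
        calendarId="primary",
        body={
            "summary": f"DEADLINE: {grant_title}",
            "start": {"date": deadline_date},  # assumes an ISO date (YYYY-MM-DD)
            "end": {"date": deadline_date},
        },
    ).execute()

    return {"draft_id": draft["id"], "event_link": event.get("htmlLink")}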

๐Ÿ—๏ธ Architectural Excellence

We have evolved the legacy Flask MVP into a Containerized FastAPI MCP Server, a major step up in reliability and scalability.

  • Microservices-Ready: Stateless architecture designed for orchestration.
  • Type-Safe: Fully typed Python codebase for maintainability.
  • Dockerized: "Write Once, Run Anywhere" deployment.

🛡️ Security & Resilience Pillars

We treat security and reliability as first-class citizens, not afterthoughts.

1. Lethal Trifecta Mitigation (Security)

  • Zero Hardcoded Secrets: All API keys and Client IDs are sourced strictly from os.environ.
  • Ephemeral Tokens: OAuth tokens are consumed via request body and never stored persistently.
  • Secure Configuration: Comprehensive .gitignore ensures no secrets are committed.
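
In practice this boils down to a fail-fast pattern like the following sketch (the exact variable handling in the codebase may differ):

import os

# Secrets are read from the environment only; startup fails loudly if they are absent.
GEMINI_API_KEY = os.environ.get("GEMINI_API_KEY")
if not GEMINI_API_KEY:
    raise RuntimeError("GEMINI_API_KEY is not set - refusing to start without it.")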

2. Network Resilience (Reliability)

  • Production AgentOps Standard: We overrode the legacy MAX_RETRY_ATTEMPTS=2 policy.
  • 5x Retry Loop: All external API calls (Grants.gov, Google Services) implement a robust 5-attempt retry mechanism with exponential backoff to survive transient network failures (5xx/429).
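
A minimal sketch of that retry policy; the helper name, backoff constants, and retryable status set are illustrative rather than the exact implementation:

import time
import requests

MAX_RETRY_ATTEMPTS = 5          # overrides the legacy value of 2
RETRYABLE_STATUS = {429, 500, 502, 503, 504}

def post_with_retry(url: str, payload: dict, timeout: float = 30.0) -> requests.Response:
    delay = 1.0
    last_error: Exception | None = None
    for attempt in range(MAX_RETRY_ATTEMPTS):
        try:
            response = requests.post(url, json=payload, timeout=timeout)
            if response.status_code not in RETRYABLE_STATUS:
                return response
            last_error = RuntimeError(f"retryable status {response.status_code}")
        except requests.RequestException as exc:  # timeouts, connection resets, etc.
            last_error = exc
        time.sleep(delay)        # exponential backoff: 1s, 2s, 4s, 8s, 16s
        delay *= 2
    raise last_error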

3. Input Validation (Safety)

  • Strict Pydantic Schemas: Every endpoint is protected by rigorous data models (GrantsQueryInput, PitchGenerateInput, etc.).
  • Injection Prevention: Validated inputs prevent XSS and injection attacks before they reach business logic.
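
For illustration, GrantsQueryInput might look roughly like this; field names follow the API reference below, while the specific constraints are an assumption:

from pydantic import BaseModel, Field

class GrantsQueryInput(BaseModel):
    keyword: str = Field(..., min_length=2, max_length=100)
    max_results: int = Field(default=20, ge=1, le=100)
    focus_area: str | None = Field(default=None, max_length=100)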

4. Logging Hygiene

  • No PII in Logs: Email bodies and pitch drafts are never logged.
  • Configurable Log Level: Set LOG_LEVEL=INFO for production; DEBUG for development.
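
A typical way to wire that up, shown here as an illustrative sketch rather than the exact setup in main.py:

import logging
import os

logging.basicConfig(
    level=os.environ.get("LOG_LEVEL", "INFO").upper(),
    format="%(asctime)s %(levelname)s %(name)s - %(message)s",
)
logger = logging.getLogger("grant_hunter")

# Log metadata only - never the email body or the generated pitch text.
logger.info("Pitch generated for grant_id=%s (length=%d chars)", "DE-FOA-0003001", 842)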

🛠️ Technical Stack

  • Runtime: Python 3.11 (Slim Docker Image)
  • Framework: FastAPI (High-performance Async I/O)
  • Server: Uvicorn (Standard ASGI)
  • AI: Google Generative AI (Gemini 2.0 Flash)
  • Integration: Google API Client (Gmail, Calendar)
  • Validation: Pydantic v2

⚡ Setup Instructions

Get the agent running in seconds.

Prerequisites

  • Docker (recommended) OR Python 3.11+
  • A Gemini API key (get one at Google AI Studio)
  • (Optional) Google OAuth credentials for Google Services integration

1. Configure Environment

# Copy the example environment file
cp .env.example .env

# Edit .env with your API keys
nano .env

Required variables:

  • GEMINI_API_KEY: Your Google Gemini API key

2. Build the Container

docker build -t grant-hunter-mcp .

3. Run the Agent

docker run -p 8080:8080 --env-file .env grant-hunter-mcp

4. Verify

Access the auto-generated OpenAPI documentation: http://localhost:8080/docs
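
You can also verify programmatically; a quick check with requests, assuming the health endpoint returns a small JSON payload as documented in the API reference below:

import requests

resp = requests.get("http://localhost:8080/health", timeout=5)
resp.raise_for_status()
print(resp.json())  # expect a status payload indicating the server is healthy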

Alternative: Run Without Docker

# Install dependencies
pip install -r requirements.txt

# Run with Uvicorn
uvicorn main:app --reload --host 0.0.0.0 --port 8000

The server will be available at http://localhost:8000.


📚 API Reference

Health Check

GET /health

Returns server health status.


POST /query_grants

Search for grant opportunities.

Request Body:

{
  "keyword": "clean energy",
  "max_results": 20,
  "focus_area": "renewable energy"
}

Response:

{
  "results": [
    {
      "id": "DE-FOA-0003001",
      "title": "AI-Driven Clean Energy Optimization SBIR",
      "agency": "Department of Energy",
      "close_date": "December 15, 2025",
      "status": "Open",
      "data_status": "COMPLETE"
    }
  ],
  "total_count": 1,
  "execution_time_ms": 1250.5
}
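
For example, calling the endpoint from Python (the payload mirrors the request body above):

import requests

payload = {"keyword": "clean energy", "max_results": 20, "focus_area": "renewable energy"}
resp = requests.post("http://localhost:8080/query_grants", json=payload, timeout=60)
resp.raise_for_status()
for grant in resp.json()["results"]:
    print(grant["close_date"], grant["agency"], grant["title"])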

POST /generate_pitch

Generate an AI-powered funding pitch.

Request Body:

{
  "startup_name": "CleanTech Solutions",
  "focus_area": "Renewable Energy",
  "grant_title": "Clean Energy Innovation Grant"
}

Response:

{
  "pitch_draft": "...",
  "model_used": "gemini-2.0-flash",
  "status": "SUCCESS"
}

POST /manage_google_services

Create Gmail draft and Calendar event for grant deadlines.

Request Body:

{
  "grant_title": "Clean Energy Innovation Grant",
  "deadline_date": "December 15, 2025",
  "oauth_token": "your_oauth_access_token"
}

Response:

{
  "gmail_status": "SUCCESS",
  "calendar_status": "SUCCESS",
  "draft_link": "https://mail.google.com/...",
  "event_link": "https://calendar.google.com/...",
  "errors": []
}
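
The three endpoints are designed to chain into a single discover-pitch-schedule loop. A hedged end-to-end sketch follows; the startup details are placeholders and the OAuth token acquisition is out of scope here:

import requests

BASE = "http://localhost:8080"

# 1. Discover: pick the first returned grant opportunity.
grants = requests.post(f"{BASE}/query_grants",
                       json={"keyword": "clean energy", "max_results": 5},
                       timeout=60).json()["results"]
top = grants[0]

# 2. Pitch: generate a ~150-word draft for that opportunity.
pitch = requests.post(f"{BASE}/generate_pitch",
                      json={"startup_name": "CleanTech Solutions",
                            "focus_area": "Renewable Energy",
                            "grant_title": top["title"]},
                      timeout=120).json()

# 3. Execute: draft the outreach email and book the deadline.
oauth_token = "..."  # obtain via your own OAuth flow; never hardcode real tokens
actions = requests.post(f"{BASE}/manage_google_services",
                        json={"grant_title": top["title"],
                              "deadline_date": top["close_date"],
                              "oauth_token": oauth_token},
                        timeout=60).json()

print(pitch["pitch_draft"])
print(actions["draft_link"], actions["event_link"])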

🔐 Environment Variables

Variable        Required  Description                               Default
GEMINI_API_KEY  Yes       Google Gemini API key                     -
GEMINI_MODEL    No        Gemini model to use                       gemini-2.0-flash
LOG_LEVEL       No        Logging level                             INFO
DEMO_MODE       No        Enable demo mode (skips real API calls)   FALSE

See .env.example for a complete list of available variables.


📁 Project Structure

mcp/
├── main.py                     # FastAPI application entry point
├── grants_gov_api.py           # Grants.gov API integration
├── pitch_generator.py          # AI pitch generation with Gemini
├── google_services_manager.py  # Gmail and Calendar integration
├── pydantic_models.py          # Input/output data models
├── mcp_definition.yaml         # MCP server definition
├── requirements.txt            # Python dependencies
├── .env.example                # Environment variables template
└── README.md                   # This file

🔌 MCP Integration

This server follows the Model Context Protocol specification. Use the mcp_definition.yaml file to configure your MCP client.

Using with Claude Desktop

  1. Update mcp_definition.yaml with your server URL
  2. Add the MCP server to your Claude Desktop configuration
  3. Start using grant discovery and pitch generation in conversations

🧪 Development

Running Tests

Warning: The tests directory is currently pending implementation. Please refer to TODO.md for the roadmap on adding unit and integration tests.

# Future command
# pytest tests/ -v

Linting

flake8 . --max-line-length=79
mypy . --strict

Security Notes

  • Never commit .env files - Contains sensitive API keys
  • OAuth tokens are ephemeral - Passed at runtime, never stored
  • All inputs validated - Using Pydantic models with strict validation
  • No hardcoded secrets - All credentials loaded from environment variables

🤝 Contributing

  1. Check TODO.md for prioritized tasks
  2. Follow the existing code style
  3. Ensure all tests pass before submitting PRs
  4. Never commit secrets or API keys

🔮 V2 Scope (Future Roadmap)

While this MVP delivers a complete "Grant Hunter" loop, our vision extends further:

  • Advanced UI: React/Next.js dashboard for visual pipeline management.
  • Team Collaboration: Multi-user support with role-based access control (RBAC).
  • Analytics Engine: Dashboard for tracking win rates and funding funnel metrics.
  • Full OAuth2 Flow: Implementing a dedicated auth service for token lifecycle management.
  • Async Network Layer: Migration from requests to httpx to handle >10k concurrent connections (the current stack is optimized for single-tenant stability).
  • Brazil Adaptation: Support for Brazilian grant sources (Transferegov, etc.).

📄 License

MIT License - See LICENSE file for details.


Built with ❤️ for founders who are building the future.
