IdeaLift MCP Server
Pre-backlog idea management with decision tracking, signal aggregation, and RICE scoring. Captures product feedback from Slack, Teams, Discord, and GitHub.
Decision intelligence for AI assistants via Model Context Protocol.
Connect Claude, ChatGPT, and other AI assistants to your product backlog. Capture feedback, track decisions, and manage ideas without leaving your AI workflow.
What it does
- Normalize ideas — Transform raw text into structured feature requests, bug reports, or tasks
- Signal aggregation — Capture product feedback from Slack, Teams, Discord, and GitHub
- Decision tracking — Record who decided what, when, and why with full audit trails
- RICE scoring — Prioritize ideas with Reach, Impact, Confidence, and Effort scores
- Ticket creation — Push structured ideas to GitHub Issues, Jira, or Linear
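The RICE score mentioned above is the standard formula (Reach × Impact × Confidence) / Effort. A minimal TypeScript sketch of that calculation — the field names and value scales here are illustrative assumptions, not IdeaLift's actual schema:

```typescript
// Illustrative RICE inputs; not IdeaLift's real data model.
interface RiceInputs {
  reach: number;      // e.g. users affected per quarter
  impact: number;     // e.g. 0.25 (minimal) up to 3 (massive)
  confidence: number; // fraction in (0, 1]
  effort: number;     // e.g. person-months; must be positive
}

function riceScore({ reach, impact, confidence, effort }: RiceInputs): number {
  if (effort <= 0) throw new Error("effort must be positive");
  return (reach * impact * confidence) / effort;
}

// Example: 500 users, high impact (2), 80% confidence, 2 person-months
console.log(riceScore({ reach: 500, impact: 2, confidence: 0.8, effort: 2 })); // 400
```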
MCP Tools
| Tool | Auth Required | Description |
|---|---|---|
| normalize_idea | No | Transform raw text into a structured idea |
| check_auth | No | Check if user is linked to IdeaLift |
| list_destinations | Yes | Get available GitHub/Jira/Linear projects |
| create_ticket | Yes | Create a ticket in a destination system |
| connect_destination | Yes | Connect a new destination |
Plus proxy tools for full IdeaLift API access (ideas, signals, decisions, roadmap).
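As with any MCP server, clients invoke these tools over JSON-RPC. A hypothetical `tools/call` request for `normalize_idea` might look like the following (the `arguments` shape is an assumption for illustration, not the documented schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "normalize_idea",
    "arguments": { "text": "Users keep asking for dark mode in the mobile app" }
  }
}
```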
Setup
1. Install dependencies
npm install
2. Configure environment
cp .env.example .env
Required environment variables:
| Variable | Description |
|---|---|
| IDEALIFT_APP_URL | IdeaLift API base URL |
| INTERNAL_API_KEY | Service-to-service auth key |
| DATABASE_HOST | SQL Server host |
| DATABASE_NAME | Database name |
| DATABASE_USERNAME | Database user |
| DATABASE_PASSWORD | Database password |
| OPENAI_API_KEY | For idea normalization |
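A minimal `.env` based on the variables above might look like this (all values are placeholders — substitute your own):

```shell
IDEALIFT_APP_URL=https://idealift.example.com
INTERNAL_API_KEY=replace-with-internal-key
DATABASE_HOST=localhost
DATABASE_NAME=idealift
DATABASE_USERNAME=idealift_user
DATABASE_PASSWORD=replace-with-password
OPENAI_API_KEY=sk-replace-me
```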
3. Build and run
npm run build
npm start
Or for development:
npm run dev
The MCP server starts on port 3001 (configurable via the PORT environment variable).
4. Connect to your AI assistant
SSE endpoint: http://localhost:3001/mcp
Add this URL as an MCP server in Claude Desktop, ChatGPT, or any MCP-compatible client.
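For clients that configure MCP servers through a JSON file (such as Claude Desktop), one common pattern is to bridge a remote SSE endpoint through the `mcp-remote` package. A sketch — the server name `idealift` is arbitrary:

```json
{
  "mcpServers": {
    "idealift": {
      "command": "npx",
      "args": ["mcp-remote", "http://localhost:3001/mcp"]
    }
  }
}
```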
Transport
Uses Server-Sent Events (SSE) transport with automatic keep-alive pings every 15 seconds.
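Keep-alive pings in SSE are typically sent as comment lines (lines starting with `:`), which EventSource clients ignore but which prevent idle proxies from closing the stream. A minimal sketch of the idea, assuming an Express-style `write` callback — not this server's actual code:

```typescript
// Format a standard SSE event frame: an "event:" line, a "data:" line,
// and a blank line terminating the frame.
function sseEvent(event: string, data: unknown): string {
  return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

// Write an SSE comment line (": ...") every intervalMs; comment lines are
// ignored by clients but keep the connection from being closed as idle.
function startKeepAlive(write: (chunk: string) => void, intervalMs = 15_000) {
  const timer = setInterval(() => write(": keep-alive\n\n"), intervalMs);
  return () => clearInterval(timer); // call the returned function to stop
}
```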
OAuth
Built-in OAuth 2.0 flow for connecting AI assistant users to their IdeaLift accounts. Supports authorization code grant with PKCE.
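The client side of PKCE (RFC 7636, S256 method) boils down to generating a random `code_verifier` and deriving `code_challenge = BASE64URL(SHA256(verifier))`. A self-contained Node sketch for illustration — not IdeaLift's implementation:

```typescript
import { createHash, randomBytes } from "node:crypto";

// Generate a PKCE verifier/challenge pair per RFC 7636 (S256 method).
// The verifier is sent on token exchange; the challenge on authorization.
function generatePkcePair(): { verifier: string; challenge: string } {
  const verifier = randomBytes(32).toString("base64url"); // 43-char URL-safe string
  const challenge = createHash("sha256").update(verifier).digest("base64url");
  return { verifier, challenge };
}
```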
Deployment
Deployed as an Azure App Service. See the IdeaLift docs for hosted access.
License
MIT