Teleprompter
<div align="center"> <img src="assets/logo.png" alt="Teleprompter Logo" width="200"/> </div>
An MCP server that stores prompt templates and exposes MCP tools for reusing them with LLMs.
Table of Contents
- Features
- MCP Configuration
- Usage Examples
- Environment Variables
- Testing
- Contributing
- License
- Acknowledgements
Features
- Prompt Storage & Reuse: Store, search, and retrieve prompt templates for LLMs.
- MCP Server: Exposes prompt tools via the Model Context Protocol (MCP).
- Prompt Variables: Supports template variables (e.g., `{{name}}`) for dynamic prompt generation.
- Search: Fast full-text search over stored prompts using MiniSearch (see the sketch after this list).
- TypeScript: Modern, type-safe codebase.
- Extensive Testing: Includes unit and integration tests with Vitest.
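To give a feel for the kind of full-text search MiniSearch provides, here is a minimal, self-contained sketch of indexing and fuzzy querying. The `PromptDoc` shape, field names, and sample documents are illustrative assumptions, not Teleprompter's actual schema.

```typescript
import MiniSearch from "minisearch";

// Hypothetical prompt records; Teleprompter's real schema may differ.
interface PromptDoc {
  id: string;
  title: string;
  body: string;
}

const prompts: PromptDoc[] = [
  { id: "spotify-discover", title: "Spotify music discovery", body: "Find new music matching my mood {{mood}}..." },
  { id: "code-review-checklist", title: "Code review checklist", body: "Review this diff for security and performance..." },
];

// Index the title and body fields; return the id and title with each hit.
const index = new MiniSearch<PromptDoc>({
  fields: ["title", "body"],
  storeFields: ["id", "title"],
});
index.addAll(prompts);

// Fuzzy matching tolerates small typos in the query.
const hits = index.search("musc discovery", { fuzzy: 0.2 });
console.log(hits.map((h) => h.id)); // e.g. ["spotify-discover"]
```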
MCP Configuration
To use Teleprompter with your LLM client, add this configuration:
```json
{
  "mcpServers": {
    "teleprompter": {
      "command": "npx",
      "args": ["-y", "mcp-teleprompter"],
      "env": {
        "PROMPT_STORAGE_PATH": "/path/to/your/prompts-directory"
      }
    }
  }
}
```
Note: Replace `/path/to/your/prompts-directory` with the absolute path where you want prompts stored.
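Most clients only need the JSON block above. If you want to verify the server from code, a rough sketch using the official MCP TypeScript SDK might look like the following; the client name and storage path are placeholders, and the tool names printed will be whatever Teleprompter actually registers.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport, getDefaultEnvironment } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server the same way the JSON config above does.
// env replaces the child's environment, so merge in the SDK defaults (PATH, etc.).
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "mcp-teleprompter"],
  env: { ...getDefaultEnvironment(), PROMPT_STORAGE_PATH: "/path/to/your/prompts-directory" },
});

const client = new Client(
  { name: "teleprompter-smoke-test", version: "0.1.0" }, // placeholder client identity
  { capabilities: {} },
);

await client.connect(transport);

// List whatever tools the server registers.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

await client.close();
```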
Usage Examples
Once configured, you can use Teleprompter with your LLM by using prompt tags in your conversations. Here's a detailed example that shows how it solves the problem of repeating complex instructions:
🎵 Music Discovery on Spotify
The Problem: Every time you want music recommendations, you have to remind your LLM of all your preferences and constraints:
- "Don't suggest songs I already have in my playlists"
- "Avoid explicit lyrics"
- "Add songs to my queue for review, not directly to playlists"
- "Focus on discovering new artists, not just popular hits"
- "Consider my current activity and mood"
- "Provide brief explanations for why each song fits"
The Solution: Create a prompt that captures all these instructions once.
Creating the prompt: Ask your LLM: "Create a prompt called 'spotify-discover' that helps me find new music with all my specific preferences and workflow requirements."
This creates a comprehensive template like:
```
I'm looking for music recommendations for Spotify based on:

**Current mood:** {{mood}}
**Activity/setting:** {{activity}}
**Preferred genres:** {{genres}}
**Recent artists I've enjoyed:** {{recent_artists}}

**Important constraints:**
- DO NOT suggest songs I already have in my existing playlists
- Avoid explicit lyrics (clean versions only)
- Focus on discovering new/lesser-known artists, not just popular hits
- Provide 5-7 song recommendations maximum

**Workflow:**
- Add recommendations to my Spotify queue (not directly to playlists)
- I'll review and save the ones I like to appropriate playlists later

**For each recommendation, include:**
- Artist and song name
- Brief explanation (1-2 sentences) of why it fits my current mood/activity
- Similar artists I might also enjoy

Please help me discover music that matches this vibe while following these preferences.
```
Using it:
```
>> spotify-discover
```
Now you just fill in your current mood and activity, and get perfectly tailored recommendations that follow all your rules, with no need to repeat your constraints every time.
Other Common Use Cases
Work Ticket Management
- Create prompts for JIRA/Linear ticket formatting with your team's specific requirements
- Include standard fields, priority levels, acceptance criteria templates
- Avoid repeating your company's ticket standards every time
📧 Email Templates
- Customer support responses with your company's tone and required disclaimers
- Follow-up sequences that match your communication style
- Automated inclusion of signatures, links, and standard information
Code Review Guidelines
- Technical review checklists with your team's specific standards
- Security considerations and performance criteria
- Documentation requirements and testing expectations
The common thread: stop repeating yourself. If you find yourself giving the same detailed instructions to your LLM repeatedly, create a prompt for it.
Discovering Existing Prompts
You can search your prompt library:
Can you search my prompts for "productivity" or "task management"?
Or list all available prompts:
What prompts do I have available?
✏️ Manual Editing
Prompts are stored as simple markdown files in your PROMPT_STORAGE_PATH directory. You can also create and edit them directly with your favorite text editor:
- Each prompt is saved as `{id}.md` in your prompts directory
- Use `{{variable_name}}` syntax for template variables (see the sketch after this list)
- Standard markdown formatting is supported
- File changes are automatically picked up by the server
💡 Best Practices
- Use descriptive IDs: Choose prompt IDs that clearly indicate their purpose (e.g., `meeting-notes`, `code-review-checklist`)
- Include helpful variables: Use `{{variable_name}}` for dynamic content that changes each time you use the prompt
- Organize by category: Consider using prefixes like `task-`, `content-`, `analysis-` to group related prompts
Testing
Run all tests:
```bash
npm test
```
Run tests with coverage:
```bash
npm run test:coverage
```
Tests are written with Vitest. Coverage reports are generated in the coverage/ directory.
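If you are contributing tests, a minimal Vitest spec has this shape; the `fillTemplate` function below is an inline stand-in for whatever module you are actually testing, not an export of this package.

```typescript
import { describe, expect, it } from "vitest";

// Illustrative function under test; substitute your own module's exports.
function fillTemplate(template: string, values: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, name) => values[name] ?? match);
}

describe("fillTemplate", () => {
  it("substitutes known variables", () => {
    expect(fillTemplate("Hello, {{name}}!", { name: "Ada" })).toBe("Hello, Ada!");
  });

  it("leaves unknown placeholders untouched", () => {
    expect(fillTemplate("Hi {{who}}", {})).toBe("Hi {{who}}");
  });
});
```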
Contributing
Contributions are welcome! Please:
- Follow the existing code style (see `.prettierrc.json` and `.eslintrc.mjs`).
- Add tests for new features or bug fixes.
License
This project is licensed under the MIT License. See LICENSE for details.
Acknowledgements
Made with ❤️ by John Anderson