🧠 Ask ChatGPT - MCP Server (Stdio)
This is a Model Context Protocol (MCP) stdio server that forwards prompts to OpenAI’s ChatGPT (GPT-4o). It is designed to run inside LangGraph-based assistants and enables advanced summarization, analysis, and reasoning by accessing an external LLM.
📌 What It Does
This server exposes a single tool:
{
  "name": "ask_chatgpt",
  "description": "Sends the provided text ('content') to an external ChatGPT (gpt-4o) model for advanced reasoning or summarization.",
  "parameters": {
    "type": "object",
    "properties": {
      "content": {
        "type": "string",
        "description": "The text to analyze, summarize, compare, or reason about."
      }
    },
    "required": ["content"]
  }
}
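A client request that satisfies this schema is easy to construct; here is a small sketch of a payload builder (the `build_ask_chatgpt_request` helper is ours, not part of server.py — only the method, tool name, and `content` argument come from the schema above):

```python
import json

def build_ask_chatgpt_request(content: str) -> str:
    """Build a one-shot tools/call request for the ask_chatgpt tool."""
    if not isinstance(content, str) or not content:
        raise ValueError("'content' is required and must be a non-empty string")
    request = {
        "method": "tools/call",
        "params": {
            "name": "ask_chatgpt",
            "arguments": {"content": content},
        },
    }
    return json.dumps(request)

print(build_ask_chatgpt_request("Summarize this config..."))
```

The resulting JSON line is exactly what the manual test below pipes into the server on stdin.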
Use this when your assistant needs to:
Summarize long documents
Analyze configuration files
Compare options
Perform advanced natural language reasoning
🐳 Docker Usage
Build and run the container:
docker build -t ask-chatgpt-mcp .
docker run -e OPENAI_API_KEY=your-openai-key -i ask-chatgpt-mcp
🧪 Manual Test
Test the server locally using a one-shot request:
echo '{"method":"tools/call","params":{"name":"ask_chatgpt","arguments":{"content":"Summarize this config..."}}}' | \
OPENAI_API_KEY=your-openai-key python3 server.py --oneshot
🧩 LangGraph Integration
To connect this MCP server to your LangGraph pipeline, configure it like this:
("chatgpt-mcp", ["python3", "server.py", "--oneshot"], "tools/discover", "tools/call")
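In one-shot mode, a stdio MCP call boils down to "spawn the command, write one JSON line to stdin, read one JSON document back from stdout". A rough, generic sketch of that plumbing (the `oneshot_call` helper is ours; with the real server the command would be `["python3", "server.py", "--oneshot"]`):

```python
import json
import subprocess
from typing import Any

def oneshot_call(command: list, request: dict) -> Any:
    """Spawn a stdio MCP server in one-shot mode, send one request, return the parsed reply."""
    proc = subprocess.run(
        command,
        input=json.dumps(request) + "\n",  # one JSON request per line on stdin
        capture_output=True,
        text=True,
        check=True,
    )
    # A one-shot server writes a single JSON document to stdout, then exits.
    return json.loads(proc.stdout)
```

The same helper works for both `tools/discover` and `tools/call` requests, since only the request body differs.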
⚙️ MCP Server Config Example
Here’s how to configure the server using an mcpServers JSON config:
{
  "mcpServers": {
    "chatgpt": {
      "command": "python3",
      "args": ["server.py", "--oneshot"],
      "env": {
        "OPENAI_API_KEY": "<YOUR_OPENAI_API_KEY>"
      }
    }
  }
}
🔍 Explanation
"command": Runs the script with Python
"args": Enables one-shot stdin/stdout mode
"env": Injects your OpenAI key securely
🌍 Environment Setup
Create a .env file (auto-loaded with python-dotenv) or export the key manually:
OPENAI_API_KEY=your-openai-key
Or:
export OPENAI_API_KEY=your-openai-key
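Either way, the server just needs the key to be resolvable at startup. A minimal stdlib-only sketch of that lookup-with-fallback (the hand-rolled `.env` parsing here is a simplified stand-in for what python-dotenv does; `load_openai_key` is our illustrative helper, not a function in server.py):

```python
import os
from pathlib import Path
from typing import Optional

def load_openai_key(env_file: str = ".env") -> Optional[str]:
    """Return OPENAI_API_KEY from the environment, falling back to a .env file."""
    key = os.environ.get("OPENAI_API_KEY")
    if key:
        return key
    path = Path(env_file)
    if path.exists():
        for line in path.read_text().splitlines():
            line = line.strip()
            # Look only for the one key this server needs.
            if line.startswith("OPENAI_API_KEY="):
                return line.split("=", 1)[1].strip()
    return None
```

python-dotenv handles quoting, comments, and exports properly; this sketch only shows the precedence (real environment first, `.env` second).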
📦 Dependencies
Installed during the Docker build:
openai
requests
python-dotenv
📁 Project Structure
.
├── Dockerfile # Docker build for the MCP server
├── server.py # Main stdio server implementation
└── README.md # You're reading it!
🔐 Security Notes
Never commit .env files or API keys.
Store secrets in secure environment variables or secret managers.