🧠 Ask ChatGPT - MCP Server (Stdio)
This is a Model Context Protocol (MCP) stdio server that forwards prompts to OpenAI's ChatGPT (GPT-4o). It is designed to run inside LangGraph-based assistants and enables advanced summarization, analysis, and reasoning by delegating prompts to an external LLM.
📌 What It Does
This server exposes a single tool:
{
  "name": "ask_chatgpt",
  "description": "Sends the provided text ('content') to an external ChatGPT (gpt-4o) model for advanced reasoning or summarization.",
  "parameters": {
    "type": "object",
    "properties": {
      "content": {
        "type": "string",
        "description": "The text to analyze, summarize, compare, or reason about."
      }
    },
    "required": ["content"]
  }
}
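Under the hood, a call to this tool maps onto a single chat-completion request against gpt-4o. The sketch below is illustrative only and is not the actual server.py; it assumes the current openai Python SDK:

# Hypothetical sketch of the tool's core logic -- not the actual server.py.
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def ask_chatgpt(content: str) -> str:
    """Forward the caller's text to GPT-4o and return the model's reply."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": content}],
    )
    return response.choices[0].message.content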
Use this when your assistant needs to:
Summarize long documents
Analyze configuration files
Compare options
Perform advanced natural language reasoning
🐳 Docker Usage
Build and run the container:
docker build -t ask-chatgpt-mcp .
docker run -e OPENAI_API_KEY=your-openai-key -i ask-chatgpt-mcp
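If you prefer not to pass the key on the command line, Docker can read it from your .env file instead:

docker run --env-file .env -i ask-chatgpt-mcp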
🧪 Manual Test
Test the server locally using a one-shot request:
echo '{"method":"tools/call","params":{"name":"ask_chatgpt","arguments":{"content":"Summarize this config..."}}}' | \
OPENAI_API_KEY=your-openai-key python3 server.py --oneshot
🧩 LangGraph Integration
To connect this MCP server to your LangGraph pipeline, register it with its server name, launch command, and the discovery and call methods it supports:
("chatgpt-mcp", ["python3", "server.py", "--oneshot"], "tools/discover", "tools/call")
⚙️ MCP Server Config Example
Here’s how to configure the server using an mcpServers JSON config:
{
  "mcpServers": {
    "chatgpt": {
      "command": "python3",
      "args": [
        "server.py",
        "--oneshot"
      ],
      "env": {
        "OPENAI_API_KEY": "<YOUR_OPENAI_API_KEY>"
      }
    }
  }
}
🔍 Explanation
"command": Runs the script with Python
"args": Enables one-shot stdin/stdout mode
"env": Injects your OpenAI key securely
🌍 Environment Setup
Create a .env file (auto-loaded with python-dotenv) or export the key manually:
OPENAI_API_KEY=your-openai-key
Or:
export OPENAI_API_KEY=your-openai-key
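For reference, loading the key from a .env file in Python with python-dotenv looks like this (a generic sketch, not necessarily identical to server.py):

# Generic python-dotenv pattern; server.py is assumed to do something similar.
import os
from dotenv import load_dotenv

load_dotenv()  # reads OPENAI_API_KEY from a local .env file, if present
api_key = os.environ.get("OPENAI_API_KEY")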
📦 Dependencies
Installed during the Docker build:
openai
requests
python-dotenv
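If you run the server outside Docker, install the same packages locally:

pip install openai requests python-dotenv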
📁 Project Structure
.
├── Dockerfile # Docker build for the MCP server
├── server.py # Main stdio server implementation
└── README.md # You're reading it!
🔐 Security Notes
Never commit .env files or API keys.
Store secrets in secure environment variables or secret managers.
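For example, keep the key out of version control by ignoring the .env file:

# .gitignore
.env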