MCP Server Template (Python)
by Nisarg38
A ready-to-use template for building Model Context Protocol (MCP) servers in Python. This template helps you quickly create servers that can register and expose tools and prompts for AI models to use.
Table of Contents
- Quick Start
- Command Line Options
- Creating Your Own Tools and Prompts
- Project Structure
- Deployment Options
- Development Guide
- Need Help?
Quick Start
Prerequisites
- Python 3.10 or newer
Setup in 3 Easy Steps
1. Install the package
# Clone the repository
git clone https://github.com/nisarg38/mcp-server-template-python.git my-mcp-server
cd my-mcp-server
# Install in development mode
pip install -e ".[dev]"
2. Run your server
# Run with Python
python -m src.main
# Or use the convenient CLI
mcp-server-template
3. Your server is now live!
Access your MCP server at:
- HTTP: http://localhost:8080
- Or use the stdio transport:
mcp-server-template --transport stdio
You'll see log output confirming the server is running successfully.
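To check the server programmatically, you can connect with the official MCP Python SDK client. The sketch below assumes the HTTP transport is SSE-based and exposed at the SDK's default /sse path; adjust the URL if your configuration differs.
import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Assumed endpoint: the FastMCP default SSE path on the port shown above
    async with sse_client("http://localhost:8080/sse") as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

asyncio.run(main())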
Command Line Options
Customize your server behavior with these command-line options:
# Change port (default: 8080)
mcp-server-template --port 9000
# Enable debug mode for more detailed logs
mcp-server-template --debug
# Use stdio transport instead of HTTP
mcp-server-template --transport stdio
# Set logging level (options: debug, info, warning, error)
mcp-server-template --log-level debug
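These flags are parsed in the template's entry point and forwarded to the underlying FastMCP server. The exact wiring in src/main.py may differ, but a minimal sketch of how such options can be handled looks like this:
import argparse
from mcp.server.fastmcp import FastMCP

def run():
    # Hypothetical wiring -- the template's actual argument handling may differ.
    parser = argparse.ArgumentParser(description="Run the MCP server")
    parser.add_argument("--port", type=int, default=8080)
    parser.add_argument("--transport", choices=["sse", "stdio"], default="sse")
    parser.add_argument("--log-level", default="info", choices=["debug", "info", "warning", "error"])
    parser.add_argument("--debug", action="store_true")
    args = parser.parse_args()

    # FastMCP accepts server settings such as port, debug, and log_level as keyword arguments.
    mcp = FastMCP(
        "my-mcp-server",
        port=args.port,
        debug=args.debug,
        log_level=args.log_level.upper(),
    )
    mcp.run(transport=args.transport)

if __name__ == "__main__":
    run()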
Creating Your Own Tools and Prompts
Add a Tool
Tools are functions that AI models can call. To add a new tool:
1. Edit src/main.py
2. Add a new function with the @mcp.tool() decorator:
from typing import Any, Dict  # make sure these are imported at the top of src/main.py

@mcp.tool()
def your_tool_name(param1: str, param2: int) -> Dict[str, Any]:
    """
    Your tool description - this will be shown to the AI.

    Args:
        param1: Description of first parameter
        param2: Description of second parameter

    Returns:
        Dictionary with your results
    """
    # Your tool logic here
    return {"result": "your result"}
Add a Prompt
Prompts are templates that AI models can access:
@mcp.prompt()
def your_prompt_name(param: str) -> str:
    """Your prompt description."""
    return f"""
    Your formatted prompt with {param} inserted.
    Use this for structured prompt templates.
    """
Project Structure
src/                 # Source code directory
├── main.py          # Server entry point with tools & prompts
├── config.py        # Configuration settings
├── utils/           # Utility functions
├── tools/           # Tools implementation
└── resources/       # Resource definitions
test/                # Tests directory
pyproject.toml       # Package configuration
Dockerfile           # Docker support
Deployment Options
Docker Deployment
# Build the Docker image
docker build -t my-mcp-server .
# Run the container
docker run -p 8080:8080 my-mcp-server
Cloud Deployment
This template is designed to work well with various cloud platforms:
- Deploy as a container on AWS, GCP, or Azure
- Run on serverless platforms that support containerized applications
- Works with Kubernetes for orchestration
Development Guide
# Install development dependencies
pip install -e ".[dev]"
# Run tests
pytest
# Format code
black src test
isort src test
# Run linting
flake8 src test
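Because @mcp.tool() in the official SDK returns the underlying function unchanged, tools can be unit-tested as plain Python functions. Here is a minimal sketch using the hypothetical count_words tool from above; your actual test layout may differ:
# test/test_count_words.py
from src.main import count_words  # hypothetical tool from the example above

def test_count_words_basic():
    result = count_words("hello model context protocol")
    assert result["word_count"] == 4
    assert result["longest_word"] == "protocol"

def test_count_words_empty():
    result = count_words("")
    assert result == {"word_count": 0, "longest_word": ""}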
Need Help?
- MCP Documentation: modelcontextprotocol.io
- File an issue: GitHub Issues
- Community: Join our Discord community (link coming soon)
Made with ❤️ for the AI developer community