
# MCP Starter Project

by sharmatriloknath

The MCP Starter Project provides a foundation for building AI applications that interact with external tools and APIs using the Model Context Protocol (MCP). It includes both a Python MCP server and a TypeScript/JavaScript MCP client.


## What is MCP?

The Model Context Protocol (MCP) is an open standard that lets AI applications interact with external tools and APIs. This project consists of two main components:

1. **MCP Server**: a Python service that defines and exposes tools/functions that AI models can call
2. **MCP Client**: a TypeScript/JavaScript client that connects to the MCP server and manages interactions between AI models and tools
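At its core, the server's documentation-search tool boils down to building a site-restricted web search query. A minimal Python sketch of that idea (the `docs_urls` mapping and `build_search_query` name are illustrative, not the actual `main.py` implementation):

```python
# Illustrative sketch of a documentation-search tool's core logic.
# The mapping and function name are hypothetical, not the real main.py.
docs_urls = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai/en/stable",
    "openai": "platform.openai.com/docs",
}

def build_search_query(library: str, query: str) -> str:
    """Restrict a web search to one library's documentation site."""
    if library not in docs_urls:
        raise ValueError(f"Unsupported library: {library}")
    return f"site:{docs_urls[library]} {query}"
```

The resulting string (e.g. `site:python.langchain.com/docs RAG`) would then be handed to a search API such as Serper.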

## Project Structure

```
mcp_starter/
├── mcp-server/           # Python MCP server implementation
│   ├── main.py          # Server with documentation search tool
│   └── pyproject.toml   # Python dependencies
└── mcp-clients/         # TypeScript MCP client implementation
    ├── index.ts         # Express server with HuggingFace integration
    └── package.json     # Node.js dependencies
```

## Getting Started

### Prerequisites

- Python 3 (for the MCP server)
- Node.js and npm (for the MCP client)
- A Serper API key and a Hugging Face API key (see Environment Variables below)

### Setting Up the Server

1. Create a Python virtual environment and activate it:

   ```sh
   cd mcp-server
   python -m venv .venv
   # On Windows
   .venv\Scripts\activate
   # On macOS/Linux
   source .venv/bin/activate
   ```

2. Install dependencies:

   ```sh
   pip install -e .
   ```

3. Create a `.env` file in the `mcp-server` directory:

   ```
   SERPER_API_KEY=your_serper_api_key_here
   ```

### Setting Up the Client

1. Install Node.js dependencies:

   ```sh
   cd mcp-clients
   npm install
   ```

2. Create a `.env` file in the `mcp-clients` directory:

   ```
   HUGGINGFACE_API_KEY=your_huggingface_api_key_here
   ```

3. Build the TypeScript code:

   ```sh
   npm run build
   ```

## Running the Application

1. Start the MCP server:

   ```sh
   cd mcp-server
   python main.py
   ```

2. In a new terminal, start the client, passing it the path to the server script:

   ```sh
   cd mcp-clients
   node build/index.js ../mcp-server/main.py
   ```

## Using the API

The client exposes two endpoints:

- **Health Check**: `GET http://localhost:3000/health`
- **Chat**: `POST http://localhost:3000/chat`

Example chat request:

```json
{
  "query": "Search the langchain docs for RAG",
  "sessionId": "user123"
}
```

## Features

- **Documentation Search Tool**: Search the documentation of popular AI libraries:
  - LangChain
  - LlamaIndex
  - OpenAI
- **Conversation Management**: Maintains chat history per session
- **Tool Integration**: Seamlessly integrates AI model responses with tool calls
- **Error Handling**: Robust error handling for API calls and tool execution
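Per-session conversation management can be as simple as a dictionary keyed by session ID. A sketch of the idea (in Python for brevity; the actual client is TypeScript, and its internal structure may differ):

```python
from collections import defaultdict

# Hypothetical in-memory session store, keyed by sessionId.
histories: dict[str, list[dict]] = defaultdict(list)

def append_message(session_id: str, role: str, content: str) -> list[dict]:
    """Record one message and return the session's full history."""
    histories[session_id].append({"role": role, "content": content})
    return histories[session_id]
```

Because each history is isolated by `sessionId`, two users chatting concurrently never see each other's context.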

## How It Works

1. The MCP server defines the tools that AI models can call.
2. The client connects to the MCP server and retrieves the list of available tools.
3. When a user sends a query, the client:
   - formats the conversation history,
   - sends it to the Hugging Face model,
   - extracts and executes any tool calls in the model's response, and
   - returns the final response, including the tool results.
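The steps above can be sketched as a single loop (Python pseudocode for the TypeScript client's logic; `call_model` and `call_tool` stand in for the Hugging Face API and MCP tool invocation, and are not the real client's function names):

```python
# Illustrative sketch of the client's query-handling loop.
def handle_query(history, query, call_model, call_tool):
    history.append({"role": "user", "content": query})
    reply = call_model(history)                # the model may request a tool
    while reply.get("tool"):                   # e.g. {"tool": "get_docs", "args": {...}}
        result = call_tool(reply["tool"], reply["args"])
        history.append({"role": "tool", "content": result})
        reply = call_model(history)            # the model sees the tool result
    history.append({"role": "assistant", "content": reply["text"]})
    return reply["text"]
```

The loop keeps feeding tool results back to the model until it answers without requesting another tool, at which point the final text is returned to the user.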

## Environment Variables

### Server

- `SERPER_API_KEY`: API key for Serper, which provides the Google Search functionality behind the documentation search tool

### Client

- `HUGGINGFACE_API_KEY`: API key for accessing Hugging Face models

## License

MIT License