# mcd-demo

by jspoelstra

This project demonstrates the creation of simple MCP servers and their integration with a LangChain agent. It provides examples for weather, math, and telemetry servers.
## Prerequisites
### Create a virtual environment

```shell
python3 -m venv venv
source venv/bin/activate
```
### Install dependencies

```shell
pip install -r requirements.txt
```
### Set environment variables

Set the following environment variables. Get the values from Azure AI Foundry, where the models are deployed:

```shell
export AZURE_OPENAI_API_KEY=<your_azure_openai_api_key>
export AZURE_OPENAI_ENDPOINT=<your_azure_openai_endpoint>
```
Optionally, you can set the following environment variables to configure the MCP servers:

```shell
export MCP_MATH_URI=http://<server-uri>:5001/sse
```
Note: You can also set these variables in a `.env` file in the root directory of the project.
## Running the agent
### Start the MCP servers

You have to start all three MCP servers before starting the agent. Each server listens on a separate port. You can start them in separate terminals or run them in the background. To run them in the background:

```shell
python weather_server.py &
python math_server.py &
python telemetry_server.py &
```
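Because backgrounded servers start asynchronously, the agent can race them. One way to avoid that is to poll each server's port before launching the agent. A small stdlib-only sketch (the host and port are assumptions; only the math server's port 5001 appears in `MCP_MATH_URI` above):

```python
import socket
import time


def wait_for_port(host: str, port: int, timeout: float = 10.0) -> bool:
    """Poll until host:port accepts a TCP connection, or give up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # A successful connect means the server is accepting connections.
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.2)  # not up yet; retry shortly
    return False


# Example (assumed port): block until the math server is reachable.
# wait_for_port("localhost", 5001)
```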
Alternatively, you can run the math server in a Docker container. To do this, first build the Docker image:

```shell
make build
```

Then, run the container:

```shell
make run-local
```

If you want to push the Docker image to a registry, tag and push it using the following commands:

```shell
make login
make push
```
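For reference, the Make targets used above might look roughly like the following sketch. This is a hedged illustration only: the image name, tag, registry host, and exposed port are all assumptions, not taken from the project's actual Makefile.

```makefile
# Hypothetical values; adjust to match the real project.
IMAGE    ?= math-server
TAG      ?= latest
# Assumption: an Azure Container Registry.
REGISTRY ?= myregistry.azurecr.io

build:
	docker build -t $(IMAGE):$(TAG) .

run-local:
	docker run --rm -p 5001:5001 $(IMAGE):$(TAG)

login:
	docker login $(REGISTRY)

push:
	docker tag $(IMAGE):$(TAG) $(REGISTRY)/$(IMAGE):$(TAG)
	docker push $(REGISTRY)/$(IMAGE):$(TAG)
```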
### Start the agent

```shell
python agent.py
```
## Killing the MCP servers

```shell
pkill -9 -f weather_server.py
pkill -9 -f math_server.py
pkill -9 -f telemetry_server.py
```

Note: `-9` sends SIGKILL, which terminates the servers immediately. Omitting it sends the default SIGTERM, which gives them a chance to shut down cleanly.