mcd-demo
by jspoelstra
This project demonstrates the creation of simple MCP servers and their integration with a LangChain agent. It provides examples for weather, math, and telemetry servers.
What is mcd-demo?
This project demonstrates multiple MCP (Model Context Protocol) servers (weather, math, and telemetry) integrated with a LangChain agent. Each server exposes specific functionality as tools that the agent can call.
How to use mcd-demo?
To use this demo, first create a virtual environment and install the dependencies from the requirements.txt file. Then set the required environment variables for the Azure OpenAI API key and endpoint, and optionally configure the MCP server URIs. Start all three MCP servers (weather, math, and telemetry) before starting the agent. Finally, run the agent with python agent.py.
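As a sketch of the configuration step, the snippet below validates the environment before launching the agent. The variable names (AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT, and the *_MCP_URI names) and the default ports are assumptions for illustration; the actual names used by agent.py may differ, so check the repository's README.

```python
# Hypothetical configuration names and defaults -- not taken from the repo.
REQUIRED = ["AZURE_OPENAI_API_KEY", "AZURE_OPENAI_ENDPOINT"]

# Optional MCP server URIs with illustrative local defaults.
OPTIONAL_DEFAULTS = {
    "WEATHER_MCP_URI": "http://localhost:8001",
    "MATH_MCP_URI": "http://localhost:8002",
    "TELEMETRY_MCP_URI": "http://localhost:8003",
}

def load_config(env):
    """Return a config dict, raising if a required variable is missing."""
    missing = [name for name in REQUIRED if not env.get(name)]
    if missing:
        raise RuntimeError(f"missing required environment variables: {missing}")
    config = {name: env[name] for name in REQUIRED}
    for name, default in OPTIONAL_DEFAULTS.items():
        config[name] = env.get(name, default)
    return config
```

Passing the environment in as a plain dict (e.g. `load_config(dict(os.environ))`) keeps the function easy to test and fails fast with a clear message instead of an opaque authentication error later.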
Key features of mcd-demo
Integration with LangChain agent
Example MCP servers (weather, math, telemetry)
Docker support for math server
Configurable environment variables
Simple server implementations
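To give a rough feel for what a "simple server implementation" exposes, here is a toy tool-registry sketch in plain Python. It mimics the shape of an MCP tool server (named tools the agent invokes with keyword arguments) without depending on the actual MCP SDK; the tool name and dispatch logic are illustrative, not the repo's code.

```python
# Toy stand-in for an MCP server: tools are plain functions registered
# by name, and a single entry point dispatches agent calls to them.
TOOLS = {}

def tool(func):
    """Register a function as a callable tool, keyed by its name."""
    TOOLS[func.__name__] = func
    return func

@tool
def add(a: float, b: float) -> float:
    """Example math tool: return the sum of two numbers."""
    return a + b

def handle_call(name, arguments):
    """Route a tool call to the registered function, like a server would."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**arguments)
```

A real MCP server additionally publishes tool schemas and speaks the protocol's transport (e.g. stdio or HTTP), but the register-and-dispatch pattern is the core idea.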
Use cases of mcd-demo
Demonstrating LangChain agent interaction with external services
Building AI applications that require real-time data
Creating modular and scalable AI systems
Testing and prototyping MCP server integrations
Learning about LangChain and MCP server architectures
FAQ from mcd-demo
What are MCP servers?
MCP (Model Context Protocol) servers expose tools and data to language-model applications over a standard protocol. In this demo, they provide weather information, mathematical calculations, and telemetry data.
What is LangChain?
LangChain is a framework for developing applications powered by language models. It allows you to connect language models to other sources of data and computation.
How do I get the Azure OpenAI API key and endpoint?
You can obtain the Azure OpenAI API key and endpoint from the Azure AI Foundry where the models are deployed.
Can I run the MCP servers in Docker?
Yes, the math server can be run in a Docker container. Instructions for building, running, pushing, and logging into the Docker image are provided in the README.
What if I don't want to use all three MCP servers?
The agent is likely configured to use all three. You may need to modify the agent.py file to remove dependencies on any servers you don't intend to run.
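One hedged alternative to editing the code is selecting servers via an environment variable. The variable name (MCP_ENABLED_SERVERS), server names, and URIs below are assumptions for illustration; the actual agent.py may be structured quite differently.

```python
# All servers the demo knows about; names and URIs are illustrative.
ALL_SERVERS = {
    "weather": "http://localhost:8001",
    "math": "http://localhost:8002",
    "telemetry": "http://localhost:8003",
}

def enabled_servers(env):
    """Return only the servers named in MCP_ENABLED_SERVERS (comma-separated).

    If the variable is unset, fall back to all three servers, matching
    the demo's default behaviour.
    """
    raw = env.get("MCP_ENABLED_SERVERS")
    if not raw:
        return dict(ALL_SERVERS)
    wanted = {name.strip() for name in raw.split(",") if name.strip()}
    return {name: uri for name, uri in ALL_SERVERS.items() if name in wanted}
```

For example, MCP_ENABLED_SERVERS=math,weather would skip the telemetry server entirely, so only the servers you actually start need to be running.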