
LangChain Agent with MCP Servers

by esakrissa

A LangChain agent that uses Model Context Protocol (MCP) adapters to integrate tools from various services. It demonstrates how to build an agent that dynamically selects and invokes tools based on user queries.



What is LangChain Agent with MCP Servers?

This project is a LangChain agent that uses Model Context Protocol (MCP) to interact with various services like Tavily Search, a mock weather service, and a math expression evaluator. It leverages LangGraph's ReAct agent pattern for dynamic tool selection.
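Tool wiring of this kind typically starts from a configuration mapping that tells the client how to launch each MCP server. The sketch below is illustrative only: the server names, script paths, and transport choices are assumptions, not taken from the repository.

```python
# Illustrative MCP server configuration (names and paths are assumptions).
# Each entry describes how to launch one tool server as a local subprocess
# and communicate with it over stdio.
server_config = {
    "search": {
        "command": "python",
        "args": ["src/mcpserver/search_server.py"],  # hypothetical path
        "transport": "stdio",
    },
    "weather": {
        "command": "python",
        "args": ["src/mcpserver/weather_server.py"],  # hypothetical path
        "transport": "stdio",
    },
    "math": {
        "command": "python",
        "args": ["src/mcpserver/math_server.py"],  # hypothetical path
        "transport": "stdio",
    },
}

# All three servers run as local subprocesses over stdio.
assert all(cfg["transport"] == "stdio" for cfg in server_config.values())
```

A mapping like this is typically handed to `MultiServerMCPClient` from `langchain-mcp-adapters`; the tools it exposes are then passed to LangGraph's `create_react_agent`, which handles the ReAct-style tool selection.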

How to use LangChain Agent with MCP Servers?

To use this agent, clone the repository, create a virtual environment, install dependencies, configure API keys in a .env file, and run the src/agent.py script. The agent will prompt for your query and process it using the appropriate tools.
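The steps above look roughly like the following. The repository URL is left as a placeholder, and the dependency file name is an assumption:

```bash
# Clone the repository (placeholder URL; use the project's GitHub URL)
git clone <repo-url>
cd <repo-dir>

# Create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate

# Install dependencies (file name assumed)
pip install -r requirements.txt

# Configure API keys in a .env file
cat > .env <<'EOF'
OPENAI_API_KEY=your-openai-key
TAVILY_API_KEY=your-tavily-key
EOF

# Run the agent and enter a query when prompted
python src/agent.py
```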

Key features of LangChain Agent with MCP Servers

  • Graceful Shutdown

  • Subprocess Management

  • Error Handling

  • Modular Design

Use cases of LangChain Agent with MCP Servers

  • Web search and news retrieval

  • Weather information retrieval

  • Mathematical expression evaluation

  • Dynamic tool selection based on user queries

FAQ from LangChain Agent with MCP Servers

What is MCP?

Model Context Protocol (MCP) is an open protocol that standardizes how applications expose tools and context to language models. This project uses it to connect the agent to external tool servers through a common interface.

What services are integrated?

The agent integrates with Tavily Search, a mock weather service, and a math expression evaluator.

How do I add a new MCP server?

Create a new file in src/mcpserver/, implement the server with signal handling, update src/mcpserver/__init__.py, and add the server configuration to src/agent.py.
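A minimal sketch of such a server file, assuming a hypothetical `src/mcpserver/math_server.py` with an `evaluate` tool function (the MCP registration and request loop are omitted; they would follow the project's existing servers):

```python
import ast
import operator
import signal
import sys

# Supported operators for safe arithmetic evaluation (no eval()).
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def evaluate(expression: str) -> float:
    """Evaluate a basic math expression by walking its AST."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp):
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp):
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError(f"Unsupported expression: {expression!r}")
    return _eval(ast.parse(expression, mode="eval"))

def _shutdown(signum, frame):
    # Signal handling: exit cleanly when the agent terminates this subprocess.
    sys.exit(0)

signal.signal(signal.SIGINT, _shutdown)
signal.signal(signal.SIGTERM, _shutdown)
```

The signal handlers are what let the agent's graceful-shutdown logic stop the server cleanly instead of leaving an orphaned process.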

What is the purpose of graceful shutdown?

Graceful shutdown ensures the agent terminates cleanly: it captures termination signals, tracks the MCP server subprocesses it spawns, and terminates each of them properly before exiting.
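The pattern can be sketched with the standard library alone. This is an illustration of the technique, not the project's exact implementation; the `processes` registry and `shutdown` helper are assumed names.

```python
import signal
import subprocess
import sys

# Registry of spawned MCP server subprocesses (name assumed for illustration).
processes: list[subprocess.Popen] = []

def shutdown(signum=None, frame=None):
    """Terminate every tracked subprocess, escalating to kill on timeout."""
    for proc in processes:
        if proc.poll() is None:      # still running
            proc.terminate()         # polite stop (SIGTERM on POSIX)
            try:
                proc.wait(timeout=5)
            except subprocess.TimeoutExpired:
                proc.kill()          # force stop if it ignores termination
    if signum is not None:
        sys.exit(0)

# Capture termination signals so Ctrl+C also cleans up child servers.
signal.signal(signal.SIGINT, shutdown)
signal.signal(signal.SIGTERM, shutdown)

# Demo: spawn a long-running child, then shut it down cleanly.
child = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(60)"])
processes.append(child)
shutdown()
```

Escalating from `terminate()` to `kill()` matters because a subprocess that hangs in its own cleanup would otherwise block the agent's exit indefinitely.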

What API keys are required?

The agent requires OpenAI and Tavily API keys, which should be configured in a .env file.