
MCP Proxy Pydantic Agent

by p2c2e

This example demonstrates how to integrate Model Context Protocol (MCP) servers with Pydantic.AI. It showcases the use of a proxy agent to interact with MCP servers using Pydantic models.


What is MCP Proxy Pydantic Agent?

This is a sample project demonstrating how to build a proxy agent that interacts with MCP servers using Pydantic.AI. It leverages LLMs for function calling and demonstrates integration with external services like weather and time servers.

How to use MCP Proxy Pydantic Agent?

  1. Clone the repository.
  2. Run uv sync to install dependencies.
  3. Navigate to the mcp-client directory.
  4. Run uv run client.py (requires OpenAI and Anthropic API keys) or uv run client2.py (uses only PydanticAI).
  5. Interact with the agent by asking questions like "What is the time in NY when it is 7:30pm in Bangalore?" or "What is the Weather currently in Chicago?".
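To see what the agent's time-server tool actually computes for a question like the first one above, here is a standard-library sketch of the conversion (the date is an arbitrary assumption; the real answer depends on daylight-saving rules for the date asked about):

```python
# "What is the time in NY when it is 7:30pm in Bangalore?"
# The agent delegates this to the MCP time server; zoneinfo
# reproduces the calculation locally.
from datetime import datetime
from zoneinfo import ZoneInfo

bangalore = ZoneInfo("Asia/Kolkata")
new_york = ZoneInfo("America/New_York")

# 7:30 pm in Bangalore on an assumed date (June 1, 2024)
t = datetime(2024, 6, 1, 19, 30, tzinfo=bangalore)
print(t.astimezone(new_york).strftime("%H:%M"))  # 10:00 (IST is UTC+5:30, EDT is UTC-4)
```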

Key features of MCP Proxy Pydantic Agent

  • MCP Integration

  • Pydantic.AI Usage

  • LLM Function Calling

  • Proxy Agent Implementation

  • Example Weather Server Interaction

  • Time Server Interaction
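The proxy-agent idea behind these features can be sketched in plain Python. This is not the repo's actual code — the class names, tools, and return values are illustrative — but it shows the core pattern: merge the tools advertised by several MCP servers into one lookup and route each LLM tool call to the server that owns it.

```python
# Hedged sketch of a proxy agent over multiple MCP-style servers.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class FakeServer:
    """Stand-in for an MCP server: a named bag of callable tools."""
    name: str
    tools: dict[str, Callable[..., str]] = field(default_factory=dict)

@dataclass
class ProxyAgent:
    servers: list[FakeServer]

    def call_tool(self, tool: str, **kwargs) -> str:
        # Route to the first server that advertises the tool.
        for server in self.servers:
            if tool in server.tools:
                return server.tools[tool](**kwargs)
        raise KeyError(f"no server exposes tool {tool!r}")

weather = FakeServer("weather", {"get_forecast": lambda city: f"Sunny in {city}"})
clock = FakeServer("time", {"get_time": lambda tz: f"12:00 in {tz}"})

agent = ProxyAgent([weather, clock])
print(agent.call_tool("get_forecast", city="Chicago"))  # Sunny in Chicago
```

In the real project this routing is driven by the LLM's function-calling output rather than a direct `call_tool` invocation.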

Use cases of MCP Proxy Pydantic Agent

  • Building intelligent agents that can interact with external services.

  • Creating a unified interface for accessing multiple MCP servers.

  • Developing custom tools and functions for LLMs.

  • Demonstrating the capabilities of Pydantic.AI in agent development.

  • Prototyping MCP integrations with various LLMs.

FAQ from MCP Proxy Pydantic Agent

What is MCP?

MCP stands for Model Context Protocol. It's an open protocol that standardizes how applications expose context and tools to LLMs, so an agent can call external services (like the weather and time servers in this example) through a common interface.

What is Pydantic.AI?

Pydantic AI is a Python agent framework from the Pydantic team for building applications with LLMs. It uses Pydantic models and type hints to define typed tool inputs and structured outputs, which makes LLM function calling validated and model-agnostic.
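Why typed models matter for function calling: the framework can derive a JSON schema for each tool from its type hints, so the LLM knows exactly what arguments to emit. A stdlib-only sketch of the idea (Pydantic AI does this automatically, with full validation; the helper below is hypothetical):

```python
import inspect

# Map a few Python annotations to JSON Schema type names.
PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}

def tool_schema(fn) -> dict:
    """Build a minimal JSON-schema-like description from type hints."""
    sig = inspect.signature(fn)
    props = {
        name: {"type": PY_TO_JSON.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    return {"name": fn.__name__, "parameters": {"type": "object", "properties": props}}

def get_forecast(city: str, days: int) -> str:
    """Return the forecast for a city (illustrative tool)."""
    return f"{days}-day forecast for {city}"

print(tool_schema(get_forecast))
```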

Why are OpenAI and Anthropic API keys required?

The example code uses two different LLMs (GPT-4o and Claude Sonnet) for demonstration purposes. client.py calls the OpenAI and Anthropic libraries directly, so it needs an API key for each provider.

What is the difference between client.py and client2.py?

client.py uses both OpenAI and Anthropic libraries directly. client2.py uses only PydanticAI and is compatible with any function calling LLM.

How can I modify the code to use different LLMs?

Change the model configured in client.py or client2.py to your desired LLM and supply the corresponding API key. Since client2.py goes through PydanticAI, it works with any function-calling model the framework supports.