mcpx-py
by dylibso
mcpx-py is a Python library for interacting with Large Language Models (LLMs) using mcp.run tools. It provides a convenient way to send messages to various LLMs and retrieve structured output.
What is mcpx-py?
mcpx-py is a Python library that simplifies interaction with LLMs through the mcp.run platform. It allows users to send messages to different LLMs and receive responses, with support for structured output using Pydantic models.
How to use mcpx-py?
First, install the library using `uv add mcpx-py` or `pip install mcpx-py`. Then, obtain an mcp.run session ID using `npx --yes -p @dylibso/mcpx gen-session`. Finally, use the `Chat` class to send messages to your desired LLM, specifying the model name and optionally a `result_type` for structured output.
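The steps above can be sketched as follows. Only the environment-variable check is runnable as-is; the `Chat` call is shown commented out because it needs mcpx-py installed and a live mcp.run session, and the exact send-method name is an assumption, so check the mcpx-py README:

```python
import os

# mcpx-py picks up your mcp.run session from the MCP_RUN_SESSION_ID
# environment variable (or from a config file written by
# `npx --yes -p @dylibso/mcpx gen-session --write`).
session_id = os.environ.get("MCP_RUN_SESSION_ID")
print("session configured:", session_id is not None)

# Hypothetical usage based on the description above -- the method name
# `send_message` is an assumption, not a confirmed API:
# from mcpx_py import Chat
# chat = Chat("claude-3-5-sonnet-latest")
# print(chat.send_message("Summarize https://example.com in two sentences."))
```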
Key features of mcpx-py
Supports multiple AI providers (Claude, OpenAI, Gemini, Ollama, Llamafile)
Provides a simple interface for sending messages to LLMs
Supports structured output using Pydantic models
Includes command-line interface for interacting with LLMs and tools
Easy installation using uv or pip
Use cases of mcpx-py
Summarizing text from websites or documents
Extracting structured data from text
Building chatbots and conversational AI applications
Integrating LLMs into existing Python projects
Evaluating and comparing different LLMs
FAQ from mcpx-py
How do I get an mcp.run session ID?
Run `npx --yes -p @dylibso/mcpx gen-session` to generate a new session ID. You can either write it to a configuration file using the `--write` flag or store it in the `MCP_RUN_SESSION_ID` environment variable.
What AI providers are supported?
mcpx-py supports Claude, OpenAI, Gemini, Ollama, and Llamafile. You'll need to set up API keys for the cloud-based providers (Claude, OpenAI, Gemini) as described in the README.
How do I install mcpx-py?
You can install mcpx-py using `uv add mcpx-py` or `pip install mcpx-py`.
How do I get structured output from the LLM?
You can specify a Pydantic model as the `result_type` when creating a `Chat` instance. The LLM will then attempt to format its response according to the model's schema.
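As an illustration, a result model might look like the following. The `Summary` model is hypothetical, and the commented-out `Chat` call assumes the `result_type` keyword described above; only the Pydantic schema inspection runs as-is:

```python
from pydantic import BaseModel


class Summary(BaseModel):
    """Hypothetical result model for a summarization request."""
    title: str
    key_points: list[str]


# Hypothetical wiring per the FAQ answer above (requires mcpx-py
# and an mcp.run session):
# from mcpx_py import Chat
# chat = Chat("claude-3-5-sonnet-latest", result_type=Summary)

# This JSON schema is what the LLM's response is validated against:
schema = Summary.model_json_schema()
print(sorted(schema["properties"]))  # → ['key_points', 'title']
```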
Can I use mcpx-py from the command line?
Yes, mcpx-py includes a command-line interface called `mcpx-client`. You can install it using `uv tool install mcpx-py` and then use commands like `mcpx-client chat` and `mcpx-client tool eval-js`.