
MCP CLI

by chrishayuk

MCP CLI is a command-line interface for interacting with Model Context Protocol (MCP) servers, enabling seamless communication with LLMs. It builds on the CHUK-MCP protocol library and supports tool usage, conversation management, and multiple operational modes.


What is MCP CLI?

MCP CLI is a powerful command-line tool designed to interact with Model Context Protocol (MCP) servers. It provides several operational modes, including chat, interactive, and command modes, allowing users to communicate with LLMs, manage conversations, and use server-provided tools.

How to use MCP CLI?

To use MCP CLI, first install it from source or with uv. Configure your server settings in server_config.json, then run the mcp-cli command followed by the desired mode (chat, interactive, cmd) and any relevant options. Refer to the documentation for specific commands and arguments.
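A minimal sketch of that workflow is below. The install commands and the `--server`, `--provider`, and `--model` flags are illustrative assumptions based on the modes described above; check `mcp-cli --help` for the options your version actually accepts.

```shell
# Install from a local clone of the repository (assumed layout)
pip install -e .
# or, if the project uses uv for dependency management:
uv sync

# Start a chat session against a configured server
# (server name, provider, and model values are placeholders)
mcp-cli chat --server sqlite --provider openai --model gpt-4o
```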

Key features of MCP CLI

  • Multiple Operational Modes (Chat, Interactive, Command)

  • Multi-Provider Support (OpenAI, Ollama)

  • Robust Tool System with automatic discovery and execution

  • Advanced Conversation Management with history tracking and export

  • Rich User Experience with command completion and formatted output

Use cases of MCP CLI

  • Conversational interaction with LLMs using chat mode

  • Direct server operation and management using interactive mode

  • Scriptable automation and pipeline integration using command mode

  • Tool execution and management for specific tasks

  • Conversation history analysis and debugging
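The scriptable-automation use case above lends itself to shell pipelines. The sketch below is hypothetical: the `--server`, `--prompt`, and `--output` flag names are assumptions, so consult `mcp-cli cmd --help` for the real interface.

```shell
# Run a single non-interactive prompt in command mode and
# capture the result for use later in a pipeline
mcp-cli cmd --server sqlite \
  --prompt "List the tables in the database" \
  --output tables.txt

# Feed the captured output into the next pipeline stage
cat tables.txt
```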

FAQ from MCP CLI

What is the CHUK-MCP protocol library?

It is a Pyodide-compatible, pure-Python implementation of the MCP protocol that handles the communication layer between the CLI and the server.

How do I configure the server?

Create a server_config.json file with the server connection details, including command, arguments, and environment variables.
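As a concrete illustration, a minimal server_config.json might look like the following. The "mcpServers" key and the sqlite entry are assumptions modeled on common MCP server configurations; substitute your own server's command, arguments, and environment variables.

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "test.db"],
      "env": {}
    }
  }
}
```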

How do I use tools in chat mode?

Simply ask a question that requires tool usage, and the LLM will automatically select and call the appropriate tools.

What are the prerequisites for using MCP CLI?

Python 3.11 or higher, a valid OpenAI API key (if using OpenAI), a local Ollama installation (if using Ollama), a server configuration file, and the CHUK-MCP protocol library.

How do I contribute to the project?

Fork the repository, create a feature branch, commit your changes, push to the branch, and open a pull request.