MCPHost
by mark3labs
MCPHost is a CLI host application that enables Large Language Models (LLMs) to interact with external tools through the Model Context Protocol (MCP). It currently supports Claude 3.5 Sonnet, Ollama models, Google Gemini models, and OpenAI-compatible models.
What is MCPHost?
MCPHost acts as a host in the MCP client-server architecture, allowing LLMs to access external tools and data sources, maintain consistent context, and execute commands safely. It provides a command-line interface for interacting with various AI models through MCP servers.
How to use MCPHost?
Install MCPHost using go install github.com/mark3labs/mcphost@latest. Configure MCP servers in the ~/.mcp.json file, or specify a custom location with the --config flag. Select a model with a CLI flag (e.g., mcphost -m ollama:qwen2.5:3b) and interact with the available tools. During conversations, use the interactive commands /help, /tools, /servers, /history, and /quit. A sample first session is sketched below.
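As a quick illustration, a first session might look like the following; the install command, model tag, and interactive commands are taken from this page, and the comments are descriptive only:

    go install github.com/mark3labs/mcphost@latest
    mcphost -m ollama:qwen2.5:3b
    # inside the conversation:
    /tools     # list tools exposed by connected MCP servers
    /servers   # list configured MCP servers
    /history   # review the message history
    /quit      # end the session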
Key features of MCPHost
Interactive conversations with supported models
Support for multiple concurrent MCP servers
Dynamic tool discovery and integration
Tool calling capabilities across supported model types
Configurable MCP server locations and arguments
Consistent command interface across model types
Configurable message history window for context management
Use cases of MCPHost
Interacting with external databases through LLMs
Automating tasks using LLMs and external tools
Building custom LLM-powered applications
Integrating LLMs with existing systems via MCP servers
FAQ from MCPHost
What is MCP?
MCP stands for Model Context Protocol, a standard for LLMs to interact with external tools.
What models are supported?
MCPHost currently supports Claude 3.5 Sonnet, Ollama-compatible models, Google Gemini models, and OpenAI-compatible models.
How do I configure MCP servers?
MCP servers are configured in the ~/.mcp.json file, specifying the command and arguments for each server.
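As a minimal sketch of that file, the mcpServers layout below follows the common MCP configuration convention (a server name mapped to a command and its arguments); the filesystem server entry and its arguments are illustrative assumptions, not taken from this page:

    {
      "mcpServers": {
        "filesystem": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
        }
      }
    }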
How do I specify which model to use?
Use the -m or --model flag followed by the model name in the format provider:model (e.g., ollama:qwen2.5:3b).
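For example (the Ollama tag appears on this page; the other model strings are hypothetical placeholders shown only to illustrate the provider:model pattern, so substitute real model names for your provider):

    mcphost -m ollama:qwen2.5:3b
    mcphost --model anthropic:claude-3-5-sonnet-latest    # hypothetical model tag
    mcphost --config /path/to/mcp.json -m google:gemini-1.5-pro    # hypothetical model tag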
Where can I find MCP-compatible servers?
See the MCP Servers Repository at https://github.com/modelcontextprotocol/servers for examples and reference implementations.