Ollama MCP Chat
by godstale
Ollama MCP Chat is a desktop chatbot application that integrates Ollama's local LLM models with MCP (Model Context Protocol) servers, supporting tool calls and other extensible features. It provides a GUI built with Python and PySide6, and its capabilities can be freely extended via MCP servers.
The project is also a useful starting point for developers who want to build Python AI applications with a GUI.
Key Features
- Run Ollama LLM models locally for free
- Integrate and call various tools via MCP servers
- Manage and save chat history
- Real-time streaming responses and tool call results
- Intuitive desktop GUI (PySide6-based)
- GUI support for adding, editing, and removing MCP servers
System Requirements
- Python 3.12 or higher
- Ollama installed (for local LLM execution)
- uv (recommended for package management)
- MCP servers (implement your own or use external ones)
- smithery.ai (recommended as an MCP server registry)
Installation
- Clone the repository
```bash
git clone https://github.com/your-repo/ollama-mcp-chat.git
cd ollama-mcp-chat
```
- Install uv (if not installed)
```bash
# Using pip
pip install uv

# Or using curl (Unix-like systems)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Or using PowerShell (Windows)
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
```
- Install dependencies
```bash
uv sync
```
- Install Ollama and download a model
- Recommended model: qwen3:14b (download and run with `ollama run qwen3:14b`)
```bash
# Install Ollama (see https://ollama.ai for details)
ollama pull <model-name>
```
- MCP server configuration (optional)
- Add MCP server information to the `mcp_config.json` file. Example:
```json
{
  "mcpServers": {
    "weather": {
      "command": "python",
      "args": ["./mcp_server/mcp_server_weather.py"],
      "transport": "stdio"
    }
  }
}
```
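The repository ships its own `mcp_server/mcp_server_weather.py`, which is not reproduced here. As a rough sketch only, a stdio MCP server like the one referenced above could be written with the official MCP Python SDK (`mcp` package) and its FastMCP helper; the tool name `get_weather` and its canned response are illustrative assumptions, not the project's actual implementation:

```python
# Minimal sketch of a stdio MCP server (assumed implementation, for illustration only).
# The actual mcp_server/mcp_server_weather.py shipped with the project may differ.
from mcp.server.fastmcp import FastMCP

# Server name roughly corresponds to the "weather" key in mcp_config.json
mcp = FastMCP("weather")

@mcp.tool()
def get_weather(city: str) -> str:
    """Return a short weather summary for a city (dummy data for illustration)."""
    # A real server would query a weather API here.
    return f"Sunny, 22°C in {city}"

if __name__ == "__main__":
    # Serve over stdio, matching "transport": "stdio" in mcp_config.json
    mcp.run(transport="stdio")
```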
How to Run
```bash
uv run main.py
```
- The GUI will launch, and you can start chatting and using MCP tools.
Main Files
- `ui/chat_window.py`: Main GUI window; handles chat, history, settings, and MCP server management
- `agent/chat_history.py`: Manages, saves, and loads chat history
- `worker.py`: Handles asynchronous communication with the LLM and MCP servers
- `agent/llm_ollama.py`: Integrates the Ollama LLM with MCP tools; handles streaming responses
- `mcp_server/mcp_manager.py`: Manages and validates MCP server configuration files
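The exact streaming logic lives in `worker.py` and `agent/llm_ollama.py` and is not reproduced here. As a rough illustration of the general pattern, streaming a response from a local Ollama model with the `ollama` Python package looks roughly like this (the model name and prompt are placeholders, and this is not the project's actual code):

```python
# Rough sketch of streaming a response from a local Ollama model.
# Illustrates the streaming pattern only; not the project's actual code.
import ollama

stream = ollama.chat(
    model="qwen3:14b",  # any model you have pulled locally
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,        # yield the response chunk by chunk
)

for chunk in stream:
    # Each chunk carries a partial assistant message; print it as it arrives.
    print(chunk["message"]["content"], end="", flush=True)
```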
Extending MCP Servers
- Add new MCP server information to `mcp_config.json` (see the example after this list)
- Implement and prepare the MCP server executable
- Restart the application and check the MCP server list in the GUI
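For example, a second entry can be added next to the existing `weather` server, following the same `command`/`args`/`transport` shape. The `filesystem` entry below (the reference Node-based filesystem MCP server, with a placeholder directory path) is only an illustration:

```json
{
  "mcpServers": {
    "weather": {
      "command": "python",
      "args": ["./mcp_server/mcp_server_weather.py"],
      "transport": "stdio"
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"],
      "transport": "stdio"
    }
  }
}
```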
Chat History
- All conversations are automatically saved to `chat_history.json`
- You can load previous chats or start a new chat from the GUI
Exit Commands
- Type `quit`, `exit`, or `bye` in the program to exit
Notes
- Basic LLM chat works even without MCP server configuration
- Be mindful of your PC's performance and memory usage, especially with large LLM models
- MCP servers can be implemented in Python, Node.js, or other languages, and external MCP servers are also supported
License
MIT License