MCP Client for Ollama

A simple yet powerful Python client for interacting with Model Context Protocol (MCP) servers using Ollama, allowing local LLMs to use tools.

ollmcp usage demo gif

Overview

This project provides a robust Python-based client that connects to one or more Model Context Protocol (MCP) servers and uses Ollama to process queries with tool use capabilities. The client establishes connections to MCP servers, sends queries to Ollama models, and handles the tool calls the model makes.

This implementation was adapted from the Model Context Protocol quickstart guide and customized to work with Ollama, providing a user-friendly interface for interacting with LLMs that support function calling.
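
As a rough sketch of that flow, the snippet below uses the official mcp Python SDK to spawn a server over stdio, open a session, and list its tools. The server command and weather.py path are placeholders, and the real client layers model handling and a terminal UI on top of this.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Spawn an MCP server as a stdio subprocess (placeholder script path)
    params = StdioServerParameters(command="python", args=["weather.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools the server exposes
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())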

Features

  • 🌐 Multi-Server Support: Connect to multiple MCP servers simultaneously
  • 🎨 Rich Terminal Interface: Interactive console UI
  • 🛠️ Tool Management: Enable/disable specific tools or entire servers during chat sessions
  • 🧠 Context Management: Control conversation memory with configurable retention settings
  • 🔄 Cross-Language Support: Seamlessly work with both Python and JavaScript MCP servers
  • 🔍 Auto-Discovery: Automatically find and use Claude's existing MCP server configurations
  • 🚀 Dynamic Model Switching: Switch between any installed Ollama model without restarting
  • 💾 Configuration Persistence: Save and load tool preferences between sessions
  • 📊 Usage Analytics: Track token consumption and conversation history metrics
  • 🔌 Plug-and-Play: Works immediately with standard MCP-compliant tool servers
  • 🔔 Update Notifications: Automatically detects when a new version is available

Requirements

  • Python 3.10 or later
  • Ollama installed and running locally
  • UV package manager (used for the uvx option and the source install)

Quick Start

Option 1: Install with pip and run

pip install ollmcp
ollmcp

Option 2: One-step install and run

uvx ollmcp

Option 3: Install from source and run using virtual environment

git clone https://github.com/jonigl/mcp-client-for-ollama.git
cd mcp-client-for-ollama
uv venv && source .venv/bin/activate
uv pip install .
uv run -m mcp_client_for_ollama.client

Usage

Run with default settings:

ollmcp

If you don't provide any options, the client will use auto-discovery mode to find MCP servers from Claude's configuration.

Command-line Arguments

Server Options:
  • --mcp-server: Path to one or more MCP server scripts (.py or .js). Can be specified multiple times.
  • --servers-json: Path to a JSON file with server configurations.
  • --auto-discovery: Auto-discover servers from Claude's default config file (default behavior if no other options provided).
Model Options:
  • --model: Ollama model to use (default: "qwen2.5:7b")

Usage Examples

Connect to a single server:

ollmcp --mcp-server /path/to/weather.py --model llama3.2:3b

Connect to multiple servers:

ollmcp --mcp-server /path/to/weather.py --mcp-server /path/to/filesystem.js --model qwen2.5:latest

Use a JSON configuration file:

ollmcp --servers-json /path/to/servers.json --model llama3.2:1b

Interactive Commands

During chat, use these commands:

ollmcp main interface

| Command | Shortcut | Description |
|---------|----------|-------------|
| help | h | Display help and available commands |
| tools | t | Open the tool selection interface |
| model | m | List and select a different Ollama model |
| context | c | Toggle context retention (on/off) |
| clear | cc | Clear conversation history and context |
| context-info | ci | Display context statistics |
| cls | clear-screen | Clear the terminal screen |
| save-config | sc | Save current tool and model configuration to a file |
| load-config | lc | Load tool and model configuration from a file |
| reset-config | rc | Reset configuration to defaults (all tools enabled) |
| quit | q | Exit the client |

Tool and Server Selection

The tool and server selection interface allows you to enable or disable specific tools:

ollmcp tool and server selection interface

  • Enter numbers separated by commas (e.g. 1,3,5) to toggle specific tools
  • Enter ranges of numbers (e.g. 5-8) to toggle multiple consecutive tools
  • Enter S + number (e.g. S1) to toggle all tools in a specific server
  • a or all - Enable all tools
  • n or none - Disable all tools
  • d or desc - Show/hide tool descriptions
  • s or save - Save changes and return to chat
  • q or quit - Cancel changes and return to chat

Model Selection

The model selection interface shows all available models in your Ollama installation:

ollmcp model selection interface

  • Enter the number of the model you want to use
  • s or save - Save the model selection and return to chat
  • q or quit - Cancel the model selection and return to chat

Configuration Management

The client supports saving and loading tool configurations between sessions:

  • When using save-config, you can provide a name for the configuration or use the default
  • Configurations are stored in ~/.config/ollmcp/ directory
  • The default configuration is saved as ~/.config/ollmcp/config.json
  • Named configurations are saved as ~/.config/ollmcp/{name}.json

The configuration saves:

  • Current model selection
  • Enabled/disabled status of all tools
  • Context retention settings
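
The on-disk schema isn't documented here, but a saved configuration might look roughly like the following. All field names below are illustrative guesses, not the actual format:

{
  "model": "qwen2.5:7b",
  "enabledTools": {
    "get_forecast": true,
    "get_alerts": false
  },
  "contextRetention": true
}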

Server Configuration Format

The JSON configuration file should follow this format:

{
  "mcpServers": {
    "server-name": {
      "command": "command-to-run",
      "args": ["arg1", "arg2", "..."],
      "env": {
        "ENV_VAR1": "value1",
        "ENV_VAR2": "value2"
      },
      "disabled": false
    }
  }
}

On macOS, Claude's configuration file is typically located at ~/Library/Application Support/Claude/claude_desktop_config.json.
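
For example, a servers.json that launches a single hypothetical weather server would look like this (the command, script path, and environment variable are placeholders):

{
  "mcpServers": {
    "weather": {
      "command": "python",
      "args": ["/path/to/weather.py"],
      "env": {
        "WEATHER_API_KEY": "your-api-key"
      },
      "disabled": false
    }
  }
}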

Compatible Models

The following Ollama models work well with tool use:

  • qwen2.5
  • llama3.3
  • llama3.2
  • llama3.1
  • mistral

For a complete list of Ollama models with tool use capabilities, visit the official Ollama models page.

How Tool Calls Work

  1. The client sends your query to Ollama with a list of available tools
  2. If Ollama decides to use a tool, the client:
    • Extracts the tool name and arguments
    • Calls the appropriate MCP server with these arguments
    • Sends the tool result back to Ollama
    • Shows the final response
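
The sketch below illustrates this loop with the ollama Python package and the mcp SDK. It is a simplified illustration rather than the client's actual implementation: the model name, server path, and single round of tool handling are assumptions.

import asyncio

import ollama
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

MODEL = "qwen2.5:7b"  # assumed default; any tool-capable model works

async def answer(query: str) -> None:
    # Placeholder server script; the real client manages several servers at once
    params = StdioServerParameters(command="python", args=["weather.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Step 1: advertise the server's tools to Ollama as function schemas
            listed = await session.list_tools()
            tools = [
                {
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description,
                        "parameters": t.inputSchema,
                    },
                }
                for t in listed.tools
            ]

            messages = [{"role": "user", "content": query}]
            response = ollama.chat(model=MODEL, messages=messages, tools=tools)

            # Step 2: if the model requested tools, run each via MCP and
            # feed the results back for a final answer
            if response.message.tool_calls:
                messages.append(response.message)
                for call in response.message.tool_calls:
                    result = await session.call_tool(
                        call.function.name, call.function.arguments
                    )
                    # Crude flattening of MCP content blocks into text
                    messages.append({"role": "tool", "content": str(result.content)})
                response = ollama.chat(model=MODEL, messages=messages)

            # Step 3: show the final response
            print(response.message.content)

asyncio.run(answer("What's the weather like in Berlin?"))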

Where Can I Find More MCP Servers?

You can explore a collection of MCP servers in the official MCP Servers repository.

This repository contains reference implementations for the Model Context Protocol, community-built servers, and additional resources to enhance your LLM tool capabilities.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

This project was adapted from the Model Context Protocol quickstart guide and built around Ollama's tool-calling support.

Made with ❤️ by jonigl