MCP LLM Bridge
by bartolli
The MCP LLM Bridge connects Model Context Protocol (MCP) servers to OpenAI-compatible LLMs, providing a bidirectional protocol translation layer. It enables any OpenAI-compatible language model to leverage MCP-compliant tools through a standardized interface.
What is MCP LLM Bridge?
The MCP LLM Bridge is a tool that translates between the Model Context Protocol (MCP) and the OpenAI API (or other OpenAI-compatible APIs). It allows language models that support OpenAI's function calling to interact with MCP-compliant tools.
How to use MCP LLM Bridge?
1. Install the bridge using the provided commands (git clone, uv venv, uv pip install, etc.).
2. Configure the bridge with your OpenAI API key and model, or point it at a local endpoint such as Ollama or LM Studio (see the sketch below).
3. Run the main script.
4. Interact with the bridge by entering prompts.
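For step 2, the LLM settings live in Python code rather than a config file. Below is a minimal sketch of an OpenAI setup; this page only names BridgeConfig and the api_key, model, and base_url settings, so the LLMConfig name, the mcp_llm_bridge.config module path, and the environment variable names are assumptions for illustration:

```python
import os

# Hypothetical sketch of step 2 (OpenAI setup). The LLMConfig name and the
# mcp_llm_bridge.config module path are assumptions; this page only names
# BridgeConfig and its api_key, model, and base_url settings.
from mcp_llm_bridge.config import LLMConfig

llm_config = LLMConfig(
    api_key=os.environ["OPENAI_API_KEY"],       # your OpenAI API key
    model=os.getenv("OPENAI_MODEL", "gpt-4o"),  # any function-calling model
    base_url=None,                              # None -> official OpenAI endpoint
)
```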
Key features of MCP LLM Bridge
Bidirectional protocol translation between MCP and the OpenAI API (see the sketch after this list)
Support for OpenAI function calling
Compatibility with local LLM endpoints (Ollama, LM Studio)
Standardized interface for MCP tool interaction
Configuration via environment variables and Python code
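The translation layer named in the first feature is, at its core, a mapping between two JSON shapes: an MCP tool carries a name, a description, and a JSON Schema under inputSchema, while OpenAI's function calling expects the same information nested under a function key. A hedged sketch of the forward direction (the function name and exact structure here are illustrative, not the bridge's actual internals):

```python
# Illustrative only: reshape an MCP tool definition into the OpenAI
# function-calling "tools" format. MCP's inputSchema is already JSON Schema,
# so it can be passed through as the OpenAI "parameters" field.
def mcp_tool_to_openai(tool: dict) -> dict:
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool["inputSchema"],
        },
    }
```

The reverse direction turns the model's function-call arguments back into an MCP tools/call request, which is what makes the translation bidirectional.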
Use cases of MCP LLM Bridge
Connecting OpenAI models to MCP-compliant tools
Using local LLMs with MCP servers
Standardizing tool access for language models
Integrating language models with existing MCP infrastructure
FAQ from MCP LLM Bridge
What is MCP?
MCP stands for Model Context Protocol, an open standard for connecting language model applications to external tools and data sources.
Which LLMs are supported?
The bridge primarily supports the OpenAI API, but it also works with any endpoint implementing the OpenAI API specification, such as Ollama and LM Studio.
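In practice, switching providers is just a matter of changing the endpoint. Reusing the assumed LLMConfig from the sketch above, an LM Studio setup might look like this (the port-1234 URL is LM Studio's usual default, not a value taken from this page):

```python
# Point the same assumed LLMConfig at a local LM Studio server.
lmstudio_config = LLMConfig(
    api_key="not-needed",                 # local servers ignore the key
    model="local-model",                  # whichever model LM Studio has loaded
    base_url="http://localhost:1234/v1",  # LM Studio's usual default endpoint
)
```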
How do I configure the bridge for Ollama?
You can configure the bridge for Ollama by setting api_key to "not-needed", model to your desired Ollama model, and base_url to "http://localhost:11434/v1".
How do I run the tests?
Install the package with test dependencies using uv pip install -e ".[test]", then run the tests with python -m pytest -v tests/.
Where can I find the configuration?
The configuration is located in src/mcp_llm_bridge/main.py. You can modify the BridgeConfig object to customize the bridge's behavior.
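As a rough sketch of such a customization, assuming the LLMConfig name and module path from the earlier examples, an mcp_server_params field, and the MCP Python SDK's StdioServerParameters (none of which are confirmed by this page beyond BridgeConfig itself):

```python
import os

from mcp import StdioServerParameters  # official MCP Python SDK
from mcp_llm_bridge.config import BridgeConfig, LLMConfig  # module path assumed

# Hypothetical customization of the BridgeConfig object in main.py.
config = BridgeConfig(
    # Assumed field: the MCP server process the bridge spawns and talks to.
    mcp_server_params=StdioServerParameters(
        command="uvx",
        args=["mcp-server-sqlite", "--db-path", "test.db"],
    ),
    # LLM settings use the api_key, model, and base_url fields named in the FAQ.
    llm_config=LLMConfig(
        api_key=os.environ["OPENAI_API_KEY"],
        model="gpt-4o",
        base_url=None,
    ),
)
```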