OmniLLM
by sabpap
OmniLLM is an MCP server that allows Claude to query and integrate responses from other large language models (LLMs) like ChatGPT, Azure OpenAI, and Google Gemini. It creates a unified access point for all your AI needs.
What is OmniLLM?
OmniLLM is a Model Context Protocol (MCP) server designed to connect Claude with other LLMs, such as ChatGPT, Azure OpenAI, and Google Gemini. It enables Claude to query these external models and incorporate their responses, providing a more comprehensive and versatile AI experience.
How to use OmniLLM?
To use OmniLLM, you need to install it, configure it with API keys for the desired LLMs, and register it with Claude Desktop by editing the claude_desktop_config.json file. Once configured, you can use specific phrases within Claude to trigger queries to the connected LLMs.
Key features of OmniLLM
Query OpenAI's ChatGPT models
Query Azure OpenAI services
Query Google's Gemini models
Get responses from all LLMs for comparison
Check which LLM services are configured and available
Provides tools to query specific LLMs or all available LLMs
Easy integration with Claude Desktop
Use cases of OmniLLM
Comparing responses from different LLMs for a given prompt
Leveraging the strengths of different LLMs for specific tasks
Augmenting Claude's knowledge with information from other sources
Creating a more comprehensive and nuanced AI response
FAQ from OmniLLM
What are the prerequisites for using OmniLLM?
You need Python 3.10+, Claude Desktop application, and API keys for the LLMs you want to use (OpenAI, Azure OpenAI, Google Gemini).
How do I install OmniLLM?
Clone the repository, create a virtual environment, activate it, and install the required dependencies with pip install "mcp[cli]" httpx python-dotenv (quoting mcp[cli] so the shell does not expand the brackets).
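The install steps above might look like this on macOS/Linux; the repository URL and directory name are assumptions, so substitute the actual ones:

```shell
# Clone the repository (URL assumed -- substitute the real one)
git clone https://github.com/sabpap/omnillm.git
cd omnillm

# Create and activate a virtual environment (Python 3.10+)
python -m venv venv
source venv/bin/activate   # on Windows: venv\Scripts\activate

# Install the dependencies; quote mcp[cli] so the shell
# does not treat the brackets as a glob pattern
pip install "mcp[cli]" httpx python-dotenv
```

On Windows, run the same commands in PowerShell with the alternative activation path shown in the comment.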
How do I configure OmniLLM?
Create a .env file in the project root containing the API keys for the LLMs you want to connect. You only need to add keys for the services you intend to use.
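A minimal .env sketch. The variable names below follow common conventions and are assumptions, not confirmed by this page; check the project's server.py for the exact names it reads. The values shown are placeholders:

```
# Only set the keys for the services you use
OPENAI_API_KEY=your-openai-key
AZURE_OPENAI_API_KEY=your-azure-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
GEMINI_API_KEY=your-gemini-key
```

Services whose keys are absent are simply reported as unavailable by the server's status tool.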
How do I integrate OmniLLM with Claude Desktop?
Navigate to Settings > Developer > Edit Config in Claude Desktop and add the OmniLLM server configuration to your claude_desktop_config.json file, specifying the path to your server.py file.
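A sketch of the entry to add under the standard mcpServers key of claude_desktop_config.json. The server name "omnillm" and the paths are placeholders; point "command" at your virtual environment's Python and "args" at the absolute path of server.py:

```json
{
  "mcpServers": {
    "omnillm": {
      "command": "/absolute/path/to/venv/bin/python",
      "args": ["/absolute/path/to/server.py"]
    }
  }
}
```

Restart Claude Desktop after saving the file so it launches the server and picks up its tools.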
What if I encounter issues?
Check that your API keys are correctly set in the .env file, ensure Claude Desktop is configured with the correct server path, verify that all dependencies are installed in your virtual environment, and check Claude's logs for any connection or execution errors.
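To inspect those logs, Claude Desktop writes per-server MCP log files; the macOS path below is the usual location (on Windows, look under %APPDATA%\Claude\logs):

```shell
# Follow Claude Desktop's MCP server logs (macOS path)
tail -f ~/Library/Logs/Claude/mcp*.log
```

Startup errors such as a missing dependency or a wrong server.py path typically show up here immediately after Claude Desktop launches.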