LMStudio-MCP
by infinitimeless
LMStudio-MCP is a Model Context Protocol (MCP) server that enables Claude to interact with LLMs running locally via LM Studio. It bridges Claude's capabilities with your private, locally hosted models.
What is LMStudio-MCP?
LMStudio-MCP is a server that acts as a bridge between Anthropic's Claude (with MCP capabilities) and a locally running LM Studio instance. It allows Claude to access and utilize models hosted on your local machine through LM Studio's API.
How to use LMStudio-MCP?
1. Install and run LM Studio with a model loaded.
2. Install LMStudio-MCP using pip.
3. Configure Claude's MCP settings to connect to the LMStudio-MCP server, either via GitHub or a local installation (see the example configuration after this list).
4. Run the LMStudio-MCP server.
5. Connect to the MCP server in Claude when prompted.
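As a minimal sketch of step 3, a Claude Desktop configuration might look like the following. The server name lmstudio-mcp, the command, and the args here are placeholders; the exact invocation depends on how and where you installed the bridge.

```json
{
  "mcpServers": {
    "lmstudio-mcp": {
      "command": "python",
      "args": ["-m", "lmstudio_mcp"]
    }
  }
}
```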
Key features of LMStudio-MCP
Health check for LM Studio API
List available models in LM Studio
Get the currently loaded model
Generate completions using local models via Claude
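These features map onto LM Studio's OpenAI-compatible REST API. The sketch below (assuming LM Studio's local server on its default port 1234 and the standard OpenAI-compatible routes) shows the kinds of calls the bridge makes on Claude's behalf:

```python
import requests

BASE_URL = "http://127.0.0.1:1234/v1"  # LM Studio's default local server

# Health check / list available models: GET /v1/models
models = requests.get(f"{BASE_URL}/models", timeout=5).json()
print([m["id"] for m in models["data"]])

# Generate a completion with a loaded model: POST /v1/chat/completions
resp = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": models["data"][0]["id"],  # assumes at least one model is loaded
        "messages": [{"role": "user", "content": "Hello from the bridge"}],
        "temperature": 0.7,
        "max_tokens": 128,
    },
    timeout=60,
)
print(resp.json()["choices"][0]["message"]["content"])
```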
Use cases of LMStudio-MCP
Utilizing private or fine-tuned models with Claude's interface
Offloading LLM inference to local hardware
Experimenting with different models without incurring cloud costs
Combining Claude's reasoning with locally hosted knowledge
FAQ from LMStudio-MCP
What is LM Studio?
LM Studio is a software application that allows you to run Large Language Models (LLMs) locally on your computer.
What is MCP?
MCP stands for Model Context Protocol. It's an open protocol that allows Claude to interact with external tools and services, including local LLM servers.
What do I do if Claude reports 404 errors?
Ensure LM Studio is running with a model loaded, that its local server is listening on port 1234, and that your firewall isn't blocking the connection. If issues persist, try '127.0.0.1' instead of 'localhost'.
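A quick way to verify the connection from the same machine (a minimal sketch, assuming the default port):

```python
import requests

# A connection error means LM Studio's server isn't reachable at all;
# a 404 usually means the request isn't hitting the /v1 routes.
try:
    r = requests.get("http://127.0.0.1:1234/v1/models", timeout=5)
    print(r.status_code, r.json())
except requests.exceptions.ConnectionError as err:
    print("LM Studio server not reachable:", err)
```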
Why is my model not working correctly?
Some models might not fully support the OpenAI chat completions API format. Try different parameter values (temperature, max_tokens) or switch to a more compatible model.
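For instance, a more conservative request might look like the sketch below; the parameter names follow the OpenAI chat completions format, and the model id is a placeholder to be replaced with one returned by /v1/models:

```python
import requests

payload = {
    "model": "local-model",  # placeholder; use an id from GET /v1/models
    "messages": [{"role": "user", "content": "Summarize MCP in one sentence."}],
    "temperature": 0.2,      # lower temperature for more deterministic output
    "max_tokens": 256,       # explicit cap helps models with small context windows
}
r = requests.post("http://127.0.0.1:1234/v1/chat/completions", json=payload, timeout=60)
print(r.json()["choices"][0]["message"]["content"])
```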
What API endpoints are used?
The bridge currently uses only the OpenAI-compatible API endpoints of LM Studio.