
Claude-LMStudio Bridge

by infinitimeless

The Claude-LMStudio Bridge is an MCP server that enables Claude to interact with local LLMs running in LM Studio. It provides Claude with the ability to list models, generate text, and handle chat completions using your local LLMs.



What is Claude-LMStudio Bridge?

The Claude-LMStudio Bridge is an MCP server that connects Claude Desktop to LLMs running locally in LM Studio. It acts as an intermediary, letting Claude use the models loaded in LM Studio and the local hardware they run on.

How to use Claude-LMStudio Bridge?

First, install Claude Desktop and LM Studio. Then clone the repository and run the setup script (setup.sh on macOS/Linux, setup.bat on Windows). Follow the script's prompts to register the bridge as an MCP server in Claude Desktop. After setup, you can use natural-language commands in Claude to interact with your local LLMs.
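For reference, the MCP server entry the setup script adds to Claude Desktop's claude_desktop_config.json might look roughly like this (the server name and path below are placeholders, not the script's exact output; use whatever the setup script prints):

```json
{
  "mcpServers": {
    "lmstudio-bridge": {
      "command": "/absolute/path/to/claude-lmstudio-bridge/run_server.sh"
    }
  }
}
```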

Key features of Claude-LMStudio Bridge

  • Access to list available models in LM Studio

  • Ability to generate text using local LLMs

  • Support for chat completions through local models

  • Health check tool to verify connectivity with LM Studio
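The first three features map onto LM Studio's OpenAI-compatible REST API. As a rough sketch of the kind of call the bridge makes on your behalf (this assumes LM Studio's default port 1234 and the standard /v1/chat/completions endpoint; the bridge's actual internals may differ):

```python
import json
from urllib import request

# Assumed default: LM Studio's OpenAI-compatible API usually listens on
# localhost:1234. Adjust if you changed the port in LM Studio's settings.
BASE_URL = "http://localhost:1234/v1"

def chat_payload(model, messages, temperature=0.7, max_tokens=256):
    """Build the JSON body for a /v1/chat/completions request."""
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

def post_chat(model, messages):
    """Send a chat completion to LM Studio (requires a running API server)."""
    body = json.dumps(chat_payload(model, messages)).encode("utf-8")
    req = request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```

With a model loaded in LM Studio, a call like `post_chat("your-model", [{"role": "user", "content": "Hello"}])` would return the completion JSON from the local server.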

Use cases of Claude-LMStudio Bridge

  • Using Claude to generate text with a specific local LLM

  • Chatting with a local LLM through Claude

  • Experimenting with different local LLMs within the Claude environment

  • Offloading computationally intensive tasks to a local machine

FAQ about Claude-LMStudio Bridge

How do I check if the bridge is working?

Use the health check tool by asking Claude 'Can you check if my LM Studio server is running?'
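Under the hood, a health check can be as simple as asking LM Studio's API to list its models. A stdlib-only sketch of such a check (assumes the default port 1234 and the standard /v1/models endpoint; the bridge's own health-check tool may work differently):

```python
from urllib import request, error

def lmstudio_reachable(host="localhost", port=1234, timeout=2.0):
    """Return True if LM Studio's OpenAI-compatible API answers /v1/models."""
    url = f"http://{host}:{port}/v1/models"
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (error.URLError, OSError):
        return False
```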

What if Claude can't find the bridge?

Check the Claude Desktop configuration and ensure the path to run_server.sh or run_server.bat is correct and absolute. Also, verify the server script is executable (chmod +x run_server.sh on macOS/Linux).

How do I list available models?

Ask Claude 'List the available models in my local LM Studio'

What if I get a 'Cannot connect to LM Studio API' error?

Make sure LM Studio is running and the API server is enabled in LM Studio's settings. Also, check that the port (default: 1234) matches what's in your .env file.

Can I customize the bridge's behavior?

Yes, you can create a .env file to customize settings like the LM Studio host, port, and debug mode.
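A sketch of such a .env file (the variable names below are hypothetical illustrations of the settings mentioned; check the repository's README for the exact keys):

```
# Hypothetical variable names; consult the repo's README for the exact keys.
LMSTUDIO_HOST=localhost
LMSTUDIO_PORT=1234
DEBUG=false
```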