LMStudio Claude Bridge

A Model Context Protocol (MCP) server (stdio transport) that allows Claude to communicate with locally running LLM models via LM Studio.
LMStudio-MCP creates a bridge between Claude (with MCP capabilities) and your locally running LM Studio instance. This enables you to leverage your own locally running models through Claude's interface, combining Claude's capabilities with your private models.
```bash
curl -fsSL https://raw.githubusercontent.com/infinitimeless/LMStudio-MCP/main/install.sh | bash
```
```bash
git clone https://github.com/infinitimeless/LMStudio-MCP.git
cd LMStudio-MCP
pip install requests "mcp[cli]" openai
```
```bash
# Using pre-built image
docker run -it --network host ghcr.io/infinitimeless/lmstudio-mcp:latest

# Or build locally
git clone https://github.com/infinitimeless/LMStudio-MCP.git
cd LMStudio-MCP
docker build -t lmstudio-mcp .
docker run -it --network host lmstudio-mcp
```
```bash
git clone https://github.com/infinitimeless/LMStudio-MCP.git
cd LMStudio-MCP
docker-compose up -d
```
For detailed deployment instructions, see DOCKER.md.
Using GitHub directly (simplest):
```json
{
  "lmstudio-mcp": {
    "command": "uvx",
    "args": [
      "https://github.com/infinitimeless/LMStudio-MCP"
    ]
  }
}
```
Using local installation:
```json
{
  "lmstudio-mcp": {
    "command": "/bin/bash",
    "args": [
      "-c",
      "cd /path/to/LMStudio-MCP && source venv/bin/activate && python lmstudio_bridge.py"
    ]
  }
}
```
Using Docker:
```json
{
  "lmstudio-mcp-docker": {
    "command": "docker",
    "args": [
      "run",
      "-i",
      "--rm",
      "--network=host",
      "ghcr.io/infinitimeless/lmstudio-mcp:latest"
    ]
  }
}
```
For complete MCP configuration instructions, see MCP_CONFIGURATION.md.
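Before wiring the bridge into Claude, it can help to confirm that LM Studio's local server is actually reachable. The sketch below is illustrative and assumes LM Studio is running with its OpenAI-compatible API on the default port 1234; `list_local_models` is a hypothetical helper written for this check, not part of the bridge itself.

```python
import json
import urllib.error
import urllib.request


def list_local_models(base_url="http://localhost:1234/v1"):
    """Return model IDs from an OpenAI-compatible /models endpoint,
    or None if the server is unreachable (assumed LM Studio default URL)."""
    try:
        with urllib.request.urlopen(f"{base_url}/models", timeout=5) as resp:
            payload = json.load(resp)
        return [m["id"] for m in payload.get("data", [])]
    except (urllib.error.URLError, OSError):
        return None


if __name__ == "__main__":
    models = list_local_models()
    if models is None:
        print("LM Studio is not reachable - is the local server started?")
    else:
        print("Available models:", models)
```

If this prints a model list, the MCP configurations above should be able to talk to the same endpoint.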
The bridge provides the following functions:

- `health_check()`: Verify if the LM Studio API is accessible
- `list_models()`: Get a list of all available models in LM Studio
- `get_current_model()`: Identify which model is currently loaded
- `chat_completion(prompt, system_prompt, temperature, max_tokens)`: Generate text from your local model

This project supports multiple deployment methods:
| Method | Use Case | Pros | Cons |
|---|---|---|---|
| Local Python | Development, simple setup | Fast, direct control | Requires Python setup |
| Docker | Isolated environments | Clean, portable | Requires Docker |
| Docker Compose | Production deployments | Easy management | More complex setup |
| Kubernetes | Enterprise/scale | Highly scalable | Complex configuration |
| GitHub Direct | Zero setup | No local install needed | Requires internet |
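To illustrate what the bridge's `chat_completion(...)` function does conceptually, here is a minimal sketch of posting a chat completion to LM Studio's OpenAI-compatible endpoint. This is not the bridge's actual implementation: the default URL, the `build_payload` helper, and the parameter defaults are assumptions for the example.

```python
import json
import urllib.request


def build_payload(prompt, system_prompt=None, temperature=0.7, max_tokens=1024):
    """Assemble an OpenAI-style chat completion request body."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": prompt})
    return {
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }


def chat_completion(prompt, system_prompt=None, temperature=0.7,
                    max_tokens=1024, base_url="http://localhost:1234/v1"):
    """POST to an OpenAI-compatible /chat/completions endpoint
    (assumed LM Studio default URL) and return the reply text."""
    body = json.dumps(
        build_payload(prompt, system_prompt, temperature, max_tokens)
    ).encode()
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because LM Studio exposes an OpenAI-compatible API, the same request shape works whether the bridge runs locally, in Docker, or elsewhere, as long as it can reach the LM Studio server.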
If Claude reports 404 errors when trying to connect to LM Studio, or if certain models don't work correctly, see TROUBLESHOOTING.md for detailed troubleshooting help.
This project includes comprehensive Docker support; see DOCKER.md for complete containerization documentation.
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
MIT
This project was originally developed as "Claude-LMStudio-Bridge_V2" and has been renamed and open-sourced as "LMStudio-MCP".
🌟 If this project helps you, please consider giving it a star!