
LMStudio-MCP

MCP.Pizza Chef: infinitimeless

LMStudio-MCP is a Model Context Protocol (MCP) server that enables Claude to communicate with locally running LLMs via LM Studio. It provides health checks, model listing, current-model retrieval, and text generation using local models. The server bridges Claude's interface with private, locally hosted models, allowing secure and efficient use of local AI resources from within Claude's environment.

Use This MCP Server To

  • Bridge Claude with local LM Studio LLM models
  • Check the health status of the LM Studio API
  • List all available local models in LM Studio
  • Retrieve the currently loaded model in LM Studio
  • Generate text completions from local models via Claude
  • Combine Claude's interface with private local models
  • Enable secure local model usage without cloud dependency

README

LMStudio-MCP

A Model Context Protocol (MCP) server that allows Claude to communicate with locally running LLMs via LM Studio.


Overview

LMStudio-MCP creates a bridge between Claude (with MCP capabilities) and your locally running LM Studio instance. This allows Claude to:

  • Check the health of your LM Studio API
  • List available models
  • Get the currently loaded model
  • Generate completions using your local models

This enables you to leverage your own locally running models through Claude's interface, combining Claude's capabilities with your private models.

Prerequisites

  • Python 3.7+
  • LM Studio installed and running locally with a model loaded
  • Claude with MCP access
  • Required Python packages (see Installation)

Installation

  1. Clone this repository:

    git clone https://github.com/infinitimeless/LMStudio-MCP.git
    cd LMStudio-MCP
  2. Install the required packages (optionally inside a virtual environment, which the local MCP configuration below assumes):

    python -m venv venv
    source venv/bin/activate
    pip install requests "mcp[cli]" openai

MCP Configuration

For Claude to connect to this bridge, you need to configure the MCP settings properly. You can either:

  1. Use directly from GitHub:

    {
      "lmstudio-mcp": {
        "command": "uvx",
        "args": [
          "https://github.com/infinitimeless/LMStudio-MCP"
        ]
      }
    }
  2. Use local installation:

    {
      "lmstudio-mcp": {
        "command": "/bin/bash",
        "args": [
          "-c",
          "cd /path/to/LMStudio-MCP && source venv/bin/activate && python lmstudio_bridge.py"
        ]
      }
    }
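In Claude Desktop, entries like these normally sit under the top-level `mcpServers` key of `claude_desktop_config.json` (the file's location varies by platform). A sketch using the GitHub variant above, assuming Claude Desktop's standard config layout:

```json
{
  "mcpServers": {
    "lmstudio-mcp": {
      "command": "uvx",
      "args": [
        "https://github.com/infinitimeless/LMStudio-MCP"
      ]
    }
  }
}
```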

For detailed MCP configuration instructions, see MCP_CONFIGURATION.md.

Usage

  1. Start LM Studio and ensure its local server is running on port 1234 (the default)

  2. Load a model in LM Studio

  3. If running locally (not using uvx), run the LMStudio-MCP server:

    python lmstudio_bridge.py
  4. In Claude, connect to the MCP server when prompted by selecting "lmstudio-mcp"

Available Functions

The bridge provides the following functions:

  • health_check(): Verify if LM Studio API is accessible
  • list_models(): Get a list of all available models in LM Studio
  • get_current_model(): Identify which model is currently loaded
  • chat_completion(prompt, system_prompt, temperature, max_tokens): Generate text from your local model
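As a sketch of how the last of these plausibly maps onto LM Studio's OpenAI-compatible chat completions endpoint (the helper name and default values here are illustrative assumptions, not the bridge's actual code):

```python
# Illustrative only: how chat_completion's arguments would translate into
# an OpenAI-style /v1/chat/completions request body. The function name and
# defaults are assumptions, not the bridge's actual implementation.

def build_chat_request(prompt: str,
                       system_prompt: str = "You are a helpful assistant.",
                       temperature: float = 0.7,
                       max_tokens: int = 1024) -> dict:
    """Build the JSON body for LM Studio's chat completions endpoint."""
    return {
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": prompt},
        ],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

if __name__ == "__main__":
    body = build_chat_request("Summarize MCP in one sentence.")
    print(body["messages"][1]["content"])
```

LM Studio serves this body at `POST /v1/chat/completions` on its local server, which is why any OpenAI-compatible client library can also talk to it directly.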

Known Limitations

  • Some models (e.g., phi-3.5-mini-instruct_uncensored) may have compatibility issues
  • The bridge currently uses only the OpenAI-compatible API endpoints of LM Studio
  • Model responses will be limited by the capabilities of your locally loaded model

Troubleshooting

API Connection Issues

If Claude reports 404 errors when trying to connect to LM Studio:

  • Ensure LM Studio is running and has a model loaded
  • Check that LM Studio's server is running on port 1234
  • Verify your firewall isn't blocking the connection
  • Try using "127.0.0.1" instead of "localhost" in the API URL if issues persist
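A quick way to test connectivity outside Claude is to query the models endpoint directly. A minimal sketch, assuming LM Studio's default address of 127.0.0.1:1234:

```python
# Minimal connectivity check against LM Studio's OpenAI-compatible API.
# Assumes the default server address; adjust BASE_URL if yours differs.
import json
import urllib.request

BASE_URL = "http://127.0.0.1:1234/v1"  # 127.0.0.1 sidesteps localhost issues

def extract_model_ids(payload: dict) -> list:
    """Pull model ids out of an OpenAI-style /v1/models response."""
    return [m["id"] for m in payload.get("data", [])]

def check_lmstudio(base_url: str = BASE_URL) -> list:
    """Return the model ids, or raise OSError if LM Studio is unreachable."""
    with urllib.request.urlopen(f"{base_url}/models", timeout=5) as resp:
        return extract_model_ids(json.load(resp))

if __name__ == "__main__":
    try:
        print("LM Studio is up; models:", check_lmstudio())
    except OSError as exc:
        print("LM Studio is not reachable:", exc)
```

If this script prints an empty model list, the server is up but no model is loaded; if it reports unreachable, revisit the port and firewall checks above.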

Model Compatibility

If certain models don't work correctly:

  • Some models might not fully support the OpenAI chat completions API format
  • Try different parameter values (temperature, max_tokens) for problematic models
  • Consider switching to a more compatible model if problems persist

For more detailed troubleshooting help, see TROUBLESHOOTING.md.

License

MIT

Acknowledgements

This project was originally developed as "Claude-LMStudio-Bridge_V2" and has been renamed and open-sourced as "LMStudio-MCP".

LMStudio-MCP FAQ

How do I install LMStudio-MCP?
Clone the repository, ensure Python 3.7+ is installed, and follow the setup instructions in the README to install dependencies and configure LM Studio locally.
What prerequisites are needed to run LMStudio-MCP?
You need Python 3.7 or higher, LM Studio installed and running locally with a model loaded, and Claude with MCP access.
Can LMStudio-MCP work with models not hosted in LM Studio?
No. LMStudio-MCP interfaces only with models running locally via LM Studio.
How does LMStudio-MCP enhance privacy?
By enabling local model usage, LMStudio-MCP keeps data and inference on your machine, avoiding cloud transmission.
What functionalities does LMStudio-MCP provide to Claude?
It allows Claude to check LM Studio API health, list models, get the current model, and generate completions using local models.
Is LMStudio-MCP compatible with other LLM providers?
LMStudio-MCP is designed to work only with LM Studio's local models; it talks to LM Studio's OpenAI-compatible API rather than connecting to other LLM providers directly.
How do I verify LMStudio-MCP is running correctly?
After setup, you can test API health checks and model listing endpoints to confirm connectivity between Claude and LM Studio.
Can LMStudio-MCP be used in production environments?
Yes, it is suitable for secure local deployments where private model inference is required without cloud dependency.