
MaxMSP-MCP-Server

MCP.Pizza Chef: tiianhk

MaxMSP-MCP-Server is an MCP server that integrates the Model Context Protocol with Max/MSP/Jitter, allowing large language models to directly interpret, explain, and generate Max patches. It provides LLMs access to patch objects, subpatches, and official documentation, enabling advanced tasks like patch debugging, synthesis generation, and contextual understanding within the Max environment.

Use This MCP Server To

  • Explain Max/MSP patches with detailed LLM-generated commentary
  • Generate new Max patches, such as FM synthesizers, via LLMs
  • Debug Max patches by leveraging LLM understanding of patch objects
  • Retrieve and explain Max object documentation in real time
  • Interact with subpatch windows for comprehensive patch analysis
  • Automate Max patch creation workflows using natural language prompts

README

MaxMSP-MCP Server

This project uses the Model Context Protocol (MCP) to let LLMs directly understand and generate Max patches.

Understand: LLM Explaining a Max Patch

Video link. Acknowledgement: the patch being explained is downloaded from here; text comments in the original file have been deleted.

Generate: LLM Making an FM Synth

Check out the full video, where you can listen to the synthesised sounds.

The LLM agent has access to the official documentation of each object, as well as to the objects in the current patch and its subpatch windows. This helps it retrieve and explain objects, debug patches, and verify its own actions.
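Part of what makes this workflow possible is that Max patches are stored as JSON, so an LLM can read and emit them as structured text. A minimal sketch of the .maxpat format is shown below; the specific objects, ids, and layout values are illustrative, not taken from this project:

```json
{
  "patcher": {
    "boxes": [
      { "box": { "id": "obj-1", "maxclass": "newobj", "text": "cycle~ 440",
                 "patching_rect": [ 50.0, 50.0, 70.0, 22.0 ] } },
      { "box": { "id": "obj-2", "maxclass": "ezdac~",
                 "patching_rect": [ 50.0, 100.0, 45.0, 45.0 ] } }
    ],
    "lines": [
      { "patchline": { "source": [ "obj-1", 0 ],
                       "destination": [ "obj-2", 0 ] } }
    ]
  }
}
```

Here "boxes" holds the objects and "lines" holds the patch cords, each connecting an outlet index on a source box to an inlet index on a destination box.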

Installation

Prerequisites

  • Python 3.8 or newer
  • uv package manager
  • Max 9 or newer (some of the scripts require the JavaScript V8 engine); we have not yet tested Max 8 or earlier versions.

Installing the MCP server

  1. Install uv:
# On macOS and Linux:
curl -LsSf https://astral.sh/uv/install.sh | sh
# On Windows:
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
  2. Clone this repository and open its directory:
git clone https://github.com/tiianhk/MaxMSP-MCP-Server.git
cd MaxMSP-MCP-Server
  3. Create a virtual environment and install the Python dependencies:
uv venv
uv pip install -r requirements.txt
  4. Connect the MCP server to an MCP client (which hosts the LLMs):
# Claude:
python install.py --client claude
# or Cursor:
python install.py --client cursor

To use other clients (check the list), you need to download the client, manually add its configuration file path here, and connect by running python install.py --client {your_client_name}.
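Many MCP clients, including Claude Desktop and Cursor, share a common JSON configuration shape, so the entry the installer writes typically looks something like the sketch below. The server name, path, and entry-point script here are placeholders, not the project's actual values:

```json
{
  "mcpServers": {
    "MaxMSP": {
      "command": "uv",
      "args": [ "run", "--directory", "/path/to/MaxMSP-MCP-Server", "server.py" ]
    }
  }
}
```

If automatic installation fails for your client, adding an entry of this shape to the client's MCP configuration file by hand is usually equivalent.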

Installing to a Max patch

Use or copy from MaxMSP_Agent/demo.maxpat.

  1. In the first tab, click the script npm version message to verify that npm is installed.
  2. Click script npm install to install the required dependencies.
  3. Switch to the second tab to access the agent, and click script start to initiate communication with Python.

Once connected, you can interact with the LLM interface to have it explain, modify, or create Max objects within the patch.

Disclaimer

This is a third-party implementation and is not made by Cycling '74.

MaxMSP-MCP-Server FAQ

How do I install the MaxMSP-MCP-Server?
Follow the installation instructions in the GitHub repository, ensuring Max/MSP is installed and prerequisites are met.
Can the server access documentation for Max objects?
Yes, it provides LLMs with access to official Max object documentation for accurate explanations and generation.
Does this server support debugging of Max patches?
Yes, LLMs can analyze and help debug patches by understanding their structure and components.
Is it possible to generate new Max patches using this server?
Absolutely, LLMs can create new patches such as synthesizers based on user prompts.
What LLM providers are compatible with this server?
It is compatible with multiple LLM providers including OpenAI, Anthropic Claude, and Google Gemini.
Can the server handle subpatches within a Max patch?
Yes, it can access and interpret objects in subpatch windows for comprehensive context.
How does the server improve Max patch workflows?
By enabling natural language interaction, it automates patch explanation, generation, and debugging tasks.
Is real-time interaction with Max patches supported?
Yes, the server facilitates real-time understanding and generation of Max patches by LLMs.