
mcp-crew-ai

MCP.Pizza Chef: adam-paterson

MCP Crew AI Server is a lightweight Python-based server that enables running, managing, and creating CrewAI workflows. It leverages the Model Context Protocol (MCP) to integrate with LLMs and tools like Claude Desktop and Cursor IDE, facilitating seamless orchestration of multi-agent workflows. It supports automatic configuration via YAML files and flexible command line options for custom setups, making it ideal for developers building complex AI-driven workflows with minimal coding.

Use This MCP Server To

  • Run and manage multi-agent AI workflows with ease
  • Automatically load agent and task configurations from YAML files
  • Integrate with MCP clients such as Claude Desktop and Cursor IDE
  • Orchestrate complex AI workflows via the MCP protocol
  • Customize workflow configurations via command line arguments
  • Develop and test CrewAI workflows locally
  • Execute pre-configured workflows using the MCP run_workflow tool

README

MCP Crew AI Server

MCP Crew AI Server is a lightweight Python-based server designed to run, manage and create CrewAI workflows. This project leverages the Model Context Protocol (MCP) to communicate with Large Language Models (LLMs) and tools such as Claude Desktop or Cursor IDE, allowing you to orchestrate multi-agent workflows with ease.

Features

  • Automatic Configuration: Automatically loads agent and task configurations from two YAML files (agents.yml and tasks.yml), so you don't need to write custom code for basic setups.
  • Command Line Flexibility: Pass custom paths to your configuration files via command line arguments (--agents and --tasks).
  • Seamless Workflow Execution: Easily run pre-configured workflows through the MCP run_workflow tool.
  • Local Development: Run the server locally in STDIO mode, making it ideal for development and testing.

Installation

There are several ways to install the MCP Crew AI server:

Option 1: Install from PyPI (Recommended)

pip install mcp-crew-ai

Option 2: Install from GitHub

pip install git+https://github.com/adam-paterson/mcp-crew-ai.git

Option 3: Clone and Install

git clone https://github.com/adam-paterson/mcp-crew-ai.git
cd mcp-crew-ai
pip install -e .

Requirements

  • Python 3.11+
  • MCP SDK
  • CrewAI
  • PyYAML

Configuration

  • agents.yml: Define your agents with roles, goals, and backstories.
  • tasks.yml: Define tasks with descriptions, expected outputs, and assign them to agents.

Example agents.yml:

zookeeper:
  role: Zookeeper
  goal: Manage zoo operations
  backstory: >
    You are a seasoned zookeeper with a passion for wildlife conservation...

Example tasks.yml:

write_stories:
  description: >
    Write an engaging zoo update capturing the day's highlights.
  expected_output: 5 engaging stories
  agent: zookeeper
  output_file: zoo_report.md
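
The server parses these files with PyYAML and wires the resulting entries into CrewAI agents and tasks. As a rough, dependency-free sketch of the structure the files above produce (the dicts mirror what yaml.safe_load would return; the required-key list is an illustration of the expected fields, not the server's actual validation logic):

```python
# What yaml.safe_load() would return for the example agents.yml above.
agents = {
    "zookeeper": {
        "role": "Zookeeper",
        "goal": "Manage zoo operations",
        "backstory": "You are a seasoned zookeeper with a passion for wildlife conservation...",
    }
}

# What yaml.safe_load() would return for the example tasks.yml above.
tasks = {
    "write_stories": {
        "description": "Write an engaging zoo update capturing the day's highlights.",
        "expected_output": "5 engaging stories",
        "agent": "zookeeper",
        "output_file": "zoo_report.md",
    }
}

def validate(agents: dict, tasks: dict) -> list[str]:
    """Return a list of problems; an empty list means the configs line up."""
    problems = []
    for name, spec in agents.items():
        for key in ("role", "goal", "backstory"):
            if key not in spec:
                problems.append(f"agent '{name}' is missing '{key}'")
    for name, spec in tasks.items():
        for key in ("description", "expected_output", "agent"):
            if key not in spec:
                problems.append(f"task '{name}' is missing '{key}'")
        if spec.get("agent") not in agents:
            problems.append(f"task '{name}' references unknown agent '{spec.get('agent')}'")
    return problems

print(validate(agents, tasks))  # → [] (the example configs are consistent)
```

The key point is the cross-reference: each task's agent field must name a key defined in agents.yml.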

Usage

Once installed, you can run the MCP CrewAI server using either of these methods:

Standard Python Command

mcp-crew-ai --agents path/to/agents.yml --tasks path/to/tasks.yml

Using UV Execution (uvx)

For a more streamlined experience, you can use the UV execution command:

uvx mcp-crew-ai --agents path/to/agents.yml --tasks path/to/tasks.yml

Or run just the server directly:

uvx mcp-crew-ai-server

This will start the server using default configuration from environment variables.
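
To launch the server from an MCP client such as Claude Desktop, register it in the client's configuration. A minimal sketch, assuming the standard mcpServers entry format used by Claude Desktop (the server name crew-ai and the file paths are placeholders; check your client's documentation for the exact config file location):

```json
{
  "mcpServers": {
    "crew-ai": {
      "command": "uvx",
      "args": [
        "mcp-crew-ai",
        "--agents", "/path/to/agents.yml",
        "--tasks", "/path/to/tasks.yml"
      ]
    }
  }
}
```

Because the server runs in STDIO mode, the client spawns the process itself; no port or network configuration is needed.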

Command Line Options

  • --agents: Path to the agents YAML file (required)
  • --tasks: Path to the tasks YAML file (required)
  • --topic: The main topic for the crew to work on (default: "Artificial Intelligence")
  • --process: Process type to use (choices: "sequential" or "hierarchical", default: "sequential")
  • --verbose: Enable verbose output
  • --variables: JSON string or path to JSON file with additional variables to replace in YAML files
  • --version: Show version information and exit

Advanced Usage

You can also provide additional variables to be used in your YAML templates:

mcp-crew-ai --agents examples/agents.yml --tasks examples/tasks.yml --topic "Machine Learning" --variables '{"year": 2025, "focus": "deep learning"}'

These variables will replace placeholders in your YAML files. For example, {topic} will be replaced with "Machine Learning" and {year} with "2025".
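
The substitution itself is plain template interpolation. A minimal sketch of the behaviour (the replacement loop here is illustrative, not the server's exact implementation; the task text is a made-up example):

```python
import json

# A task description as it might appear in tasks.yml, with placeholders.
yaml_text = """\
write_stories:
  description: >
    Write a {year} update on {topic}, with a focus on {focus}.
"""

# Variables as passed on the command line.
variables = json.loads('{"year": 2025, "focus": "deep learning"}')
variables["topic"] = "Machine Learning"  # supplied via --topic

# Replace each {placeholder} with its value before the YAML is parsed.
for key, value in variables.items():
    yaml_text = yaml_text.replace("{" + key + "}", str(value))

print(yaml_text)
# The description now reads:
#   Write a 2025 update on Machine Learning, with a focus on deep learning.
```

Any placeholder without a matching variable would be left untouched, so it pays to keep the JSON keys aligned with the names used in your YAML files.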

Contributing

Contributions are welcome! Please open issues or submit pull requests with improvements, bug fixes, or new features.

Licence

This project is licensed under the MIT Licence. See the LICENSE file for details.

Happy workflow orchestration!

mcp-crew-ai FAQ

How do I configure agents and tasks in MCP Crew AI Server?

You configure agents and tasks using two YAML files, agents.yml and tasks.yml, which the server automatically loads at startup.

Can I customize the configuration file paths?

Yes, you can pass custom paths for agents.yml and tasks.yml using the --agents and --tasks command line arguments.

What LLMs and tools does MCP Crew AI Server support?

It integrates with MCP clients such as Claude Desktop and Cursor IDE, and can work with LLM providers such as OpenAI, Anthropic (Claude), and Google (Gemini) via MCP.

How do I run a workflow using MCP Crew AI Server?

You can run pre-configured workflows through the MCP run_workflow tool exposed by the server.

Is MCP Crew AI Server suitable for local development?

Yes, it is designed to be lightweight and supports local development and testing of CrewAI workflows.

Does MCP Crew AI Server require custom coding for basic setups?

No, it automatically loads configurations from YAML files, so no custom code is needed for basic workflow setups.

How does MCP Crew AI Server communicate with LLMs?

It uses the Model Context Protocol (MCP) to communicate securely and efficiently with LLMs and other tools, enabling multi-agent orchestration.

Can I extend MCP Crew AI Server with additional workflows?

Yes, you can create and manage new workflows by adding or modifying the YAML configuration files and running them through the server.