wecom-bot-mcp-server

MCP.Pizza Chef: loonghao

wecom-bot-mcp-server is a Python implementation of the Model Context Protocol (MCP) for WeCom (WeChat Work) bots. It provides a standardized interface for automated messaging and context-aware interactions in enterprise WeChat environments, letting AI agents integrate with WeCom, share context in real time, and manage conversations. Communication is secure, scoped, and observable, which makes the server well suited to enterprise automation, customer support, and workflow enhancement within the WeChat Work ecosystem.

Use this MCP server to:

  • Automate enterprise messaging in WeChat Work
  • Enable context-aware bot interactions in WeCom
  • Integrate AI agents with WeChat Work environments
  • Manage real-time communication for WeCom bots
  • Enhance customer support via automated WeCom bots
  • Streamline workflow automation in enterprise chat
  • Provide secure, observable bot messaging in WeCom

README

WeCom Bot MCP Server

A Model Context Protocol (MCP) compliant server implementation for WeCom (WeChat Work) bot.

English | 中文

Features

  • Support for multiple message types:
    • Text messages
    • Markdown messages
    • Image messages (base64)
    • File messages
  • @mention support (via user ID or phone number)
  • Message history tracking
  • Configurable logging system
  • Full type annotations
  • Pydantic-based data validation

Requirements

  • Python 3.10+
  • WeCom Bot Webhook URL (obtained from WeCom group settings)

Installation

There are several ways to install WeCom Bot MCP Server:

1. Automated Installation (Recommended)

Using Smithery (For Claude Desktop):

npx -y @smithery/cli install wecom-bot-mcp-server --client claude

Using VSCode with Cline Extension:

  1. Install Cline Extension from VSCode marketplace
  2. Open Command Palette (Ctrl+Shift+P / Cmd+Shift+P)
  3. Search for "Cline: Install Package"
  4. Type "wecom-bot-mcp-server" and press Enter

2. Manual Installation

Install from PyPI:

pip install wecom-bot-mcp-server

Configure MCP manually:

Create or update your MCP configuration file:

// For Windsurf: ~/.windsurf/config.json
{
  "mcpServers": {
    "wecom": {
      "command": "uvx",
      "args": [
        "wecom-bot-mcp-server"
      ],
      "env": {
        "WECOM_WEBHOOK_URL": "your-webhook-url"
      }
    }
  }
}

Configuration

Setting Environment Variables

# Windows PowerShell
$env:WECOM_WEBHOOK_URL = "your-webhook-url"

# Optional configurations
$env:MCP_LOG_LEVEL = "DEBUG"  # Log levels: DEBUG, INFO, WARNING, ERROR, CRITICAL
$env:MCP_LOG_FILE = "path/to/custom/log/file.log"  # Custom log file path
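
The same variables can be exported in a Linux/macOS shell, or set from Python before the package is used (for example in a launcher script). A minimal sketch using only the standard library:

# Minimal sketch: configure the server via environment variables from Python,
# equivalent to the PowerShell commands above.
import os

os.environ["WECOM_WEBHOOK_URL"] = "your-webhook-url"
os.environ["MCP_LOG_LEVEL"] = "DEBUG"                       # optional
os.environ["MCP_LOG_FILE"] = "path/to/custom/log/file.log"  # optional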

Log Management

The logging system uses platformdirs.user_log_dir() for cross-platform log file management:

  • Windows: C:\Users\<username>\AppData\Local\hal\wecom-bot-mcp-server
  • Linux: ~/.local/share/hal/wecom-bot-mcp-server
  • macOS: ~/Library/Application Support/hal/wecom-bot-mcp-server

The log file is named mcp_wecom.log and is stored in the above directory.
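
To locate the log file programmatically, the directory can be resolved with platformdirs. The sketch below is illustrative only, not the server's actual code; the appname/appauthor arguments are assumptions inferred from the paths listed above.

# Illustrative sketch: resolve the log directory with platformdirs.
# The "wecom-bot-mcp-server" / "hal" arguments are assumptions based on
# the paths listed above; the real path may differ slightly by platform.
from pathlib import Path
from platformdirs import user_log_dir

log_dir = Path(user_log_dir("wecom-bot-mcp-server", "hal"))
print(log_dir / "mcp_wecom.log")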

Usage

Starting the Server

wecom-bot-mcp-server

Usage Examples (With MCP)

# Scenario 1: Send weather information to WeCom
USER: "How's the weather in Shenzhen today? Send it to WeCom"
ASSISTANT: "I'll check Shenzhen's weather and send it to WeCom"

await mcp.send_message(
    content="Shenzhen Weather:\n- Temperature: 25°C\n- Weather: Sunny\n- Air Quality: Good",
    msg_type="markdown"
)

# Scenario 2: Send meeting reminder and @mention relevant people
USER: "Send a reminder for the 3 PM project review meeting, remind Zhang San and Li Si to attend"
ASSISTANT: "I'll send the meeting reminder"

await mcp.send_message(
    content="## Project Review Meeting Reminder\n\nTime: Today 3:00 PM\nLocation: Meeting Room A\n\nPlease be on time!",
    msg_type="markdown",
    mentioned_list=["zhangsan", "lisi"]
)

# Scenario 3: Send a file
USER: "Send this weekly report to the WeCom group"
ASSISTANT: "I'll send the weekly report"

await mcp.send_message(
    content=Path("weekly_report.docx"),
    msg_type="file"
)

Direct API Usage

Send Messages

from wecom_bot_mcp_server import mcp

# Send markdown message
await mcp.send_message(
    content="**Hello World!**", 
    msg_type="markdown"
)

# Send text message and mention users
await mcp.send_message(
    content="Hello @user1 @user2",
    msg_type="text",
    mentioned_list=["user1", "user2"]
)
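
Send failures (for example an unreachable webhook) surface as exceptions from the async call. A minimal defensive sketch; the broad except Exception is a placeholder, and the package's own exception types (see errors.py) can be caught instead:

# Minimal sketch: log send failures instead of letting them propagate.
# Catching Exception is a placeholder; narrower types live in errors.py.
import logging

from wecom_bot_mcp_server import mcp

async def notify(text: str) -> bool:
    try:
        await mcp.send_message(content=text, msg_type="text")
        return True
    except Exception:
        logging.exception("Failed to send WeCom message")
        return False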

Send Files

from wecom_bot_mcp_server import send_wecom_file

# Send file
await send_wecom_file("/path/to/file.txt")

Send Images

from wecom_bot_mcp_server import send_wecom_image

# Send local image
await send_wecom_image("/path/to/image.png")

# Send URL image
await send_wecom_image("https://example.com/image.png")

Development

Setup Development Environment

  1. Clone the repository:
git clone https://github.com/loonghao/wecom-bot-mcp-server.git
cd wecom-bot-mcp-server
  2. Create a virtual environment and install dependencies:
# Using uv (recommended)
pip install uv
uv venv
uv pip install -e ".[dev]"

# Or using traditional method
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -e ".[dev]"

Testing

# Using uv (recommended)
uvx nox -s pytest

# Or using traditional method
nox -s pytest

Code Style

# Check code
uvx nox -s lint

# Automatically fix code style issues
uvx nox -s lint_fix

Building and Publishing

# Build the package
uv build

# Build and publish to PyPI
uv build && twine upload dist/*

Project Structure

wecom-bot-mcp-server/
├── src/
│   └── wecom_bot_mcp_server/
│       ├── __init__.py
│       ├── server.py
│       ├── message.py
│       ├── file.py
│       ├── image.py
│       ├── utils.py
│       └── errors.py
├── tests/
│   ├── test_server.py
│   ├── test_message.py
│   ├── test_file.py
│   └── test_image.py
├── docs/
├── pyproject.toml
├── noxfile.py
└── README.md

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

wecom-bot-mcp-server FAQ

How do I install the wecom-bot-mcp-server?
You can install it via PyPI using 'pip install wecom-bot-mcp-server'.
Does the server support secure communication?
Yes, it supports secure, scoped, and observable interactions compliant with MCP standards.
Can this server integrate with multiple LLM providers?
Yes. As an MCP server it works with any MCP-compatible client, so the underlying model can come from OpenAI, Anthropic (Claude), Google (Gemini), or another provider.
What Python versions are supported?
The server requires Python 3.10 or later.
How does the server handle real-time context updates?
It implements the Model Context Protocol, which standardizes real-time context sharing and interaction management for WeCom bots.
Is there documentation available for developers?
Yes, the GitHub repository includes detailed documentation and usage examples.
Can the server be customized for specific enterprise workflows?
Yes, it is designed to be extensible and customizable to fit various enterprise automation needs.
How is the server tested for reliability?
The project runs a nox-driven pytest suite with lint checks (ruff) and code coverage reporting under continuous integration.