
unichat-mcp-server

MCP.Pizza Chef: amidabuddha

The unichat-mcp-server is a Python-based MCP server that facilitates sending AI model requests across multiple providers including OpenAI, MistralAI, Anthropic, xAI, Google AI, DeepSeek, Alibaba, and Inception. It acts as a unified interface using the MCP protocol, allowing clients to interact with diverse LLM APIs through a single tool called 'unichat'. This server requires vendor API keys and supports sending structured message requests, returning AI-generated responses seamlessly. It is also available in a TypeScript version, making it versatile for different development environments. The server is open source under the MIT license and integrates easily into MCP ecosystems for multi-provider AI workflows.

Use This MCP Server To

  • Send AI requests to multiple LLM providers via one MCP server
  • Unify API calls to OpenAI, Anthropic, Google AI, and others
  • Integrate multi-vendor AI models into MCP-based applications
  • Route structured chat messages through a single MCP tool
  • Enable vendor key-based secure AI model access
  • Use the Python or TypeScript version for flexible deployment

README


Unichat MCP Server in Python

Also available in TypeScript

Send requests to OpenAI, MistralAI, Anthropic, xAI, Google AI, DeepSeek, Alibaba, and Inception using the MCP protocol, via a tool or predefined prompts. A vendor API key is required.

Tools

The server implements one tool:

  • unichat: Send a request to unichat
    • Takes "messages" as a required argument
    • Returns the model's response
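As a rough sketch, a client-side `tools/call` request for this tool could be assembled as below. The JSON-RPC envelope is the generic MCP shape; the OpenAI-style role/content structure inside "messages" is an assumption for illustration, not taken from this project's docs:

```python
import json

# Hypothetical JSON-RPC payload invoking the "unichat" tool.
# The chat-message format (role/content pairs) is an assumed
# OpenAI-style layout; check the project README for the exact schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "unichat",
        "arguments": {
            "messages": [
                {"role": "system", "content": "You are a helpful assistant."},
                {"role": "user", "content": "Summarize MCP in one sentence."},
            ]
        },
    },
}

print(json.dumps(request, indent=2))
```

In practice an MCP client library builds this envelope for you; the sketch only shows where the "messages" argument sits in the request.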

Prompts

  • code_review
    • Review code for best practices, potential issues, and improvements
    • Arguments:
      • code (string, required): The code to review
  • document_code
    • Generate documentation for code including docstrings and comments
    • Arguments:
      • code (string, required): The code to comment
  • explain_code
    • Explain how a piece of code works in detail
    • Arguments:
      • code (string, required): The code to explain
  • code_rework
    • Apply requested changes to the provided code
    • Arguments:
      • changes (string, optional): The changes to apply
      • code (string, required): The code to rework
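Predefined prompts like these are fetched with a standard MCP `prompts/get` request. A minimal sketch, using the prompt and argument names listed above inside the generic MCP JSON-RPC envelope (the envelope itself is not project-specific):

```python
import json

# Hypothetical prompts/get request for the "code_review" prompt.
# "name" and the "code" argument come from the prompt list above;
# the rest of the envelope is the generic MCP request shape.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "prompts/get",
    "params": {
        "name": "code_review",
        "arguments": {
            "code": "def add(a, b):\n    return a + b",
        },
    },
}

print(json.dumps(request, indent=2))
```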

Quickstart

Install

Claude Desktop

On MacOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Supported Models:

A list of currently supported models to use as "SELECTED_UNICHAT_MODEL" may be found here. Make sure to add the relevant vendor API key as "YOUR_UNICHAT_API_KEY".

Example:

"env": {
  "UNICHAT_MODEL": "gpt-4o-mini",
  "UNICHAT_API_KEY": "YOUR_OPENAI_API_KEY"
}

Development/Unpublished Servers Configuration

"mcpServers": {
  "unichat-mcp-server": {
    "command": "uv",
    "args": [
      "--directory",
      "{{your source code local directory}}/unichat-mcp-server",
      "run",
      "unichat-mcp-server"
    ],
    "env": {
      "UNICHAT_MODEL": "SELECTED_UNICHAT_MODEL",
      "UNICHAT_API_KEY": "YOUR_UNICHAT_API_KEY"
    }
  }
}

Published Servers Configuration

"mcpServers": {
  "unichat-mcp-server": {
    "command": "uvx",
    "args": [
      "unichat-mcp-server"
    ],
    "env": {
      "UNICHAT_MODEL": "SELECTED_UNICHAT_MODEL",
      "UNICHAT_API_KEY": "YOUR_UNICHAT_API_KEY"
    }
  }
}

Installing via Smithery

To install Unichat for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install unichat-mcp-server --client claude

Development

Building and Publishing

To prepare the package for distribution:

  1. Remove older builds:
rm -rf dist
  2. Sync dependencies and update lockfile:
uv sync
  3. Build package distributions:
uv build

This will create source and wheel distributions in the dist/ directory.

  4. Publish to PyPI:
uv publish --token {{YOUR_PYPI_API_TOKEN}}

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory {{your source code local directory}}/unichat-mcp-server run unichat-mcp-server

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.

unichat-mcp-server FAQ

How do I authenticate requests with unichat-mcp-server?
You must provide valid vendor API keys for each AI provider you want to access through the server.
Can I use unichat-mcp-server with different programming languages?
Yes, it is available in Python and TypeScript versions to suit various development environments.
What kind of requests does the unichat tool accept?
The 'unichat' tool accepts 'messages' as required string arguments to send chat-based requests to AI models.
Is the unichat-mcp-server open source?
Yes, it is released under the MIT license, allowing free use and modification.
Does unichat-mcp-server support real-time interaction with multiple AI providers?
Yes, it routes requests to multiple providers like OpenAI, Anthropic, Google AI, and others in real time via MCP.
How do I install or deploy unichat-mcp-server?
You can install it automatically via Smithery (npx -y @smithery/cli install unichat-mcp-server --client claude) or configure it manually in your MCP client; installation instructions and source code are on its GitHub repository.
What is the main tool implemented by unichat-mcp-server?
The server implements a single tool named 'unichat' for sending requests and receiving responses from AI models.
Can unichat-mcp-server handle vendor-specific API differences?
Yes, it abstracts vendor API details behind the MCP protocol, providing a unified interface.