mcp-openai

MCP.Pizza Chef: S1M0N38

mcp-openai is a minimalist MCP client that provides an OpenAI compatible API interface for Model Context Protocol (MCP) integration. It serves as a library for building user interfaces for large language models (LLMs) that support MCP, enabling interaction with locally runnable inference engines such as vLLM, Ollama, Text Generation Inference, llama.cpp, and LMStudio. While it is a simple toy project with no planned support, it offers a useful reference implementation for developers building MCP clients on top of OpenAI-style APIs.

Use This MCP Client To

  • Build LLM user interfaces with an OpenAI compatible API
  • Integrate local inference engines via MCP
  • Develop MCP clients for AI applications
  • Prototype MCP client implementations quickly
  • Reference a minimal MCP client architecture

README

𝔐  mcp-openai  ✧

MCP Client with OpenAI compatible API

Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications.

https://modelcontextprotocol.io


Warning

This is a simple toy project. Support is not planned. Use as a reference for minimal MCP client development.

This is an MCP client (not a server). It is meant to be used as a library for building LLM UIs that support MCP through an OpenAI compatible API. This opens the door to locally runnable inference engines (vLLM, Ollama, TGI, llama.cpp, LMStudio, ...) that provide support for the OpenAI API (text generation, function calling, etc.).
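For context, the function-calling part of the OpenAI compatible API describes tools as JSON schemas. An MCP client like this one is expected to translate tools exposed by MCP servers into that shape. A generic sketch of such a tool definition (the tool name and schema are hypothetical, not part of this library):

```python
# A generic OpenAI-style tool definition (JSON schema), as used by the
# function-calling part of the OpenAI compatible API. The tool name and
# parameters below are illustrative only.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool name
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}
```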

Usage

It is highly recommended to use uv in a project based on mcp-openai:

  • It manages Python installations and virtual environments.
  • It is an executable that can run self-contained Python scripts (in our case, MCP servers).
  • It is used for CI workflows.

Add mcp-openai to your project dependencies with:

uv add mcp-openai

or install it with classic pip:

pip install mcp-openai

Create an MCP client

Now you can create a MCP client by specifying your custom configuration.

import os

from mcp_openai import MCPClient
from mcp_openai import config

mcp_client_config = config.MCPClientConfig(
    mcpServers={
        "the-name-of-the-server": config.MCPServerConfig(
            command="uv",
            args=["run", "path/to/server/scripts.py/or/github/raw"],
        )
        # add here other servers ...
    }
)

llm_client_config = config.LLMClientConfig(
    api_key="api-key-for-auth",
    base_url="https://api.openai.com/v1",
)

llm_request_config = config.LLMRequestConfig(model=os.environ["MODEL_NAME"])

client = MCPClient(
    mcp_client_config,
    llm_client_config,
    llm_request_config,
)
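Because base_url is configurable, the same client can point at a local inference engine instead of OpenAI. A config sketch, assuming Ollama's OpenAI-compatible endpoint on its default port (the model name is only an example):

```python
from mcp_openai import config

# Assumption: Ollama serves an OpenAI-compatible API under /v1 on port 11434.
local_llm_client_config = config.LLMClientConfig(
    api_key="ollama",  # the backend ignores the key, but the field is expected
    base_url="http://localhost:11434/v1",
)
local_llm_request_config = config.LLMRequestConfig(model="llama3.2")  # example model
```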

Connect and process messages with the MCP client

async def main():

    # Establish a connection between the client and the server. The name
    # must match a key in mcp_client_config.mcpServers.
    server_name = "the-name-of-the-server"
    await client.connect_to_server(server_name)

    # messages_in comes from the user interacting with the LLM,
    # e.g. a UI making use of this MCP client.
    messages_in = ...
    messages_out = await client.process_messages(messages_in)

    # messages_out contains the LLM response. If required, the LLM makes use
    # of the tools offered by the connected servers.

mcp-openai FAQ

Is mcp-openai suitable for production use?
No, mcp-openai is a simple toy project intended as a reference implementation and does not have planned support for production.
Can mcp-openai connect to local inference engines?
Yes, it supports integration with local inference engines like vLLM, Ollama, Text Generation Inference, llama.cpp, and LMStudio.
Does mcp-openai provide a full-featured MCP client?
It provides a minimal MCP client implementation primarily for learning and prototyping purposes.
How does mcp-openai handle API compatibility?
It offers an OpenAI compatible API interface to facilitate easy integration with existing OpenAI-based tools and workflows.
Can I extend mcp-openai for custom MCP client needs?
Yes, since it is open source and minimal, developers can extend it to fit specific MCP client requirements.
Is there official support or maintenance for mcp-openai?
No official support or maintenance is planned; it is provided as a community reference.
Where can I find more information about MCP?
Visit the official Model Context Protocol website at https://modelcontextprotocol.io for detailed documentation and resources.