ollama-mcp

MCP.Pizza Chef: Shlysz

Ollama-MCP is a Python-based MCP client that connects Ollama large language models with MCP servers, enabling seamless interaction between models and external tools. It supports asynchronous SSE communication, an extensible tool calling system, and offers an interactive CLI. The client simplifies tool format conversions and is designed for easy configuration and use with Ollama models like qwen3:14b.

Use This MCP client To

  • Connect Ollama LLMs to MCP servers for real-time context sharing
  • Enable asynchronous SSE communication with MCP servers
  • Invoke and manage external tools via the MCP protocol
  • Use the interactive CLI for model and tool interaction
  • Convert tool formats between Spring AI and Ollama standards
  • Configure and switch between multiple MCP servers easily
  • Integrate Ollama models into AI-enhanced workflows

README

Ollama MCP Client

A Python-based client implementation that connects Ollama large language models to MCP (Model Context Protocol) servers, enabling seamless interaction between the model and external tools.

Features

  • Integrates Ollama large language models (defaults to qwen3:14b)
  • Supports asynchronous communication via SSE (Server-Sent Events)
  • Extensible tool-calling system
  • Interactive command-line interface
  • Clean tool-format conversion (Spring AI `tools_list` format ⟷ Ollama format)

Quick Start

Requirements

  • Python 3.12+
  • Ollama CLI (see the Ollama website for installation instructions)
  • Java 21+

Installation

git clone https://github.com/Shlysz/ollama-mcp.git
cd Ollama-mcp-client
pip install -r requirements.txt

Configuration

  1. Configure the MCP server in client/mcp_server_config.json. The author's MCP server is published in the Releases section; the format is as follows:
{
  "mcpServers": {
    "server-name": {
      "url": "http://localhost:8080/sse"
    }
  }
}

If you have your own MCP server, you can configure its address here instead.
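As a rough sketch of how this configuration might be consumed, the snippet below reads the JSON file and maps each server name to its SSE endpoint URL. The function name and return shape are illustrative assumptions, not necessarily the actual API exposed by McpClient.py:

```python
import json
from pathlib import Path


def load_server_urls(config_path: str) -> dict[str, str]:
    """Map each configured MCP server name to its SSE endpoint URL.

    Expects the mcp_server_config.json layout shown above:
    {"mcpServers": {"<name>": {"url": "<sse endpoint>"}}}.
    """
    config = json.loads(Path(config_path).read_text(encoding="utf-8"))
    return {
        name: entry["url"]
        for name, entry in config.get("mcpServers", {}).items()
    }
```

A client could then iterate over the returned mapping and open one SSE session per configured server.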

  2. Configure the Ollama model in client/Constants.py:
LLAMA_MODEL_QWEN = "qwen3:14b"  # or any other model supported by Ollama

Run

python client/main.py

Project Structure

Ollama-MCP-Client
├── client
│   ├── Constants.py
│   ├── McpClient.py
│   ├── OllamaAgent.py
│   ├── OllamaTools.py
│   ├── __init__.py
│   ├── main.py
│   ├── mcp_server_config.json
│   └── utils
│       ├── JsonUtil.py
├── requirements.txt
└── test
    ├── __init__.py
    └── test_ollamaToolsformat.py

Acknowledgments

This project was inspired by mihirrd/ollama-mcp-client. That project supports only local transport and not SSE; this one is the opposite. Many thanks.

License

MIT License

ollama-mcp FAQ

How do I install the Ollama-MCP client?
Clone the GitHub repo, then run 'pip install -r requirements.txt' with Python 3.12+ installed.
What are the environment requirements for running Ollama-MCP client?
Requires Python 3.12+, Ollama CLI installed, and Java 21+.
How do I configure the MCP server for Ollama-MCP client?
Edit 'client/mcp_server_config.json' to add your MCP server URL in the specified JSON format.
Can I use Ollama-MCP client with custom MCP servers?
Yes, you can configure any MCP server URL in the config file to connect.
Which Ollama models are supported by the client?
The client supports Ollama models like 'qwen3:14b' and others supported by Ollama.
How does Ollama-MCP handle tool format conversions?
It converts between Spring AI's tools_list format and Ollama's tool format for compatibility.
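A minimal sketch of such a conversion, assuming the Spring AI `tools_list` entries carry the MCP-style `name`/`description`/`inputSchema` fields and that Ollama expects OpenAI-style `{"type": "function", ...}` tool definitions (the actual field layout in OllamaTools.py may differ):

```python
def mcp_tool_to_ollama(tool: dict) -> dict:
    """Convert one tools_list entry into Ollama's function-tool shape.

    Assumed input:  {"name": ..., "description": ..., "inputSchema": {...}}
    Assumed output: {"type": "function", "function": {"name", "description",
                     "parameters"}} -- the shape Ollama's chat API accepts.
    """
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # Fall back to an empty JSON-schema object if no schema is given.
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }
```

Going the other way (Ollama tool call back to an MCP `tools/call` request) would invert this mapping, pairing the function name with the model's generated arguments.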
Is there an interactive interface for the Ollama-MCP client?
Yes, it provides an interactive command line interface for user interaction.
How does the client communicate with MCP servers?
It uses Server-Sent Events (SSE) for asynchronous communication with MCP servers.
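For reference, SSE is a plain-text protocol: each event is a group of `field: value` lines terminated by a blank line, with `data:` lines carrying the payload. The client's own transport layer handles this internally; the parser below is purely an illustration of the wire format, not the project's implementation:

```python
def parse_sse_data(stream: str) -> list[str]:
    """Extract data payloads from a raw SSE text stream.

    Per the SSE spec, an event is dispatched only when a blank line is
    reached; multiple data lines within one event are joined with newlines.
    """
    events: list[str] = []
    buffer: list[str] = []
    for line in stream.splitlines():
        if line == "":  # blank line terminates and dispatches the event
            if buffer:
                events.append("\n".join(buffer))
                buffer = []
        elif line.startswith("data:"):
            buffer.append(line[5:].lstrip())
    # A trailing event with no terminating blank line is discarded,
    # matching the SSE specification's dispatch rule.
    return events
```

In practice each dispatched payload would then be decoded as a JSON-RPC message and routed to the MCP session.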