ollama-mcp-client

MCP.Pizza Chef: mihirrd

Ollama MCP client enables seamless integration of Ollama-based language models with external MCP servers and tools. It supports Git operations, shell commands, and an extensible tool management system, providing an interactive command-line assistant interface. Designed for Python 3.13+, it facilitates real-time interaction between Ollama LLMs and various systems, enhancing multi-step reasoning and workflow automation.

Use This MCP Client To

  • Connect Ollama LLMs to Git repositories via the MCP Git server
  • Execute shell commands through Ollama language models
  • Manage and extend tools accessible to Ollama LLMs
  • Run interactive command-line assistant sessions with Ollama models
  • Integrate Ollama LLMs with external MCP servers for complex workflows

README

Ollama MCP (Model Context Protocol)

Ollama MCP is a tool for connecting Ollama-based language models with external tools and services using the Model Context Protocol (MCP). This integration enables LLMs to interact with various systems like Git repositories, shell commands, and other tool-enabled services.

Features

  • Seamless integration between Ollama language models and MCP servers
  • Support for Git operations through MCP Git server
  • Extensible tool management system
  • Interactive command-line assistant interface

Installation

  1. Ensure you have Python 3.13+ installed
  2. Clone this repository
  3. Install dependencies:
# Create a virtual environment
uv venv
# Activate the virtual environment
source .venv/bin/activate
# Install the package in development mode
uv pip install -e .

Usage

Running the Git Assistant

uv run main.py

Running Tests

pytest -xvs tests/test_ollama_toolmanager.py

Running the Git assistant starts an interactive CLI where you can ask the assistant to perform Git operations.

Extending with Custom Tools

You can extend the system by:

  1. Creating new tool wrappers
  2. Registering them with the OllamaToolManager
  3. Connecting to different MCP servers
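The registration step above follows a simple registry pattern. The sketch below illustrates that pattern in plain Python; the class and method names here are hypothetical stand-ins, not the project's actual OllamaToolManager API, which may differ.

```python
from typing import Callable, Dict

class ToolRegistry:
    """Minimal tool registry, similar in spirit to OllamaToolManager."""

    def __init__(self) -> None:
        # Map tool names to callables the LLM can invoke
        self._tools: Dict[str, Callable[..., str]] = {}

    def register(self, name: str, fn: Callable[..., str]) -> None:
        self._tools[name] = fn

    def call(self, name: str, **kwargs) -> str:
        # Dispatch a tool call by name with keyword arguments
        return self._tools[name](**kwargs)

registry = ToolRegistry()
registry.register("git_status", lambda repo: f"status of {repo}")
print(registry.call("git_status", repo="/path/to/repo"))
```

A custom tool wrapper is then just a named callable handed to the manager before the agent starts its session.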

Components

  • OllamaToolManager: Manages tool registrations and execution
  • MCPClient: Handles communication with MCP servers
  • OllamaAgent: Orchestrates Ollama LLM and tool usage

Examples

# Creating a Git-enabled agent
import asyncio
from mcp import StdioServerParameters

git_params = StdioServerParameters(
    command="uvx",
    args=["mcp-server-git", "--repository", "/path/to/repo"],
    env=None,
)

# Connect and register tools
async def main():
    async with MCPClient(git_params) as client:
        # Register the server's tools with the agent,
        # then use the agent for Git operations
        ...

asyncio.run(main())

Requirements

  • Python 3.13+
  • MCP 1.5.0+
  • Ollama 0.4.7+

ollama-mcp-client FAQ

How do I install the ollama-mcp-client?
Ensure Python 3.13+ is installed, clone the repo, create and activate a virtual environment with uv, and run 'uv pip install -e .' to install in development mode.
What programming language is required for ollama-mcp-client?
Python 3.13 or higher is required to run and develop with ollama-mcp-client.
Can ollama-mcp-client interact with Git repositories?
Yes, it supports Git operations through the MCP Git server integration.
How do I start the interactive command-line assistant?
Run 'uv run main.py' to launch the interactive assistant interface.
Is the tool management system extensible?
Yes, ollama-mcp-client includes an extensible tool management system for adding new capabilities.
How can I run tests for ollama-mcp-client?
Use the command 'pytest -xvs tests/test_ollama_toolmanager.py' to run the test suite.
What LLM providers does ollama-mcp-client support?
It primarily supports Ollama models but can integrate with other MCP-compatible LLMs like OpenAI, Claude, and Gemini through MCP servers.
Does ollama-mcp-client support real-time multi-step reasoning?
Yes, it enables real-time interaction and multi-step reasoning by connecting Ollama LLMs with external MCP servers and tools.