
MCP.Pizza Chef: runebookai

Tome is a client that streamlines the use of local large language models (LLMs) with MCP servers. It eliminates complex setup by managing MCP servers automatically, enabling instant connections to local LLM providers like Ollama. Users can easily find MCP servers via the Smithery marketplace or connect custom servers with simple commands, facilitating quick, seamless chat interactions with MCP-powered models. Tome is designed for developers and users seeking a hassle-free way to integrate local LLMs into their workflows.

Use This MCP client To

  • Connect local LLMs to MCP servers without manual setup
  • Manage MCP servers via Smithery marketplace integration
  • Instantly chat with MCP-powered models locally
  • Use custom uvx/npx commands to add MCP servers
  • Simplify local AI model experimentation and testing

README

Tome

A magical tool for using local LLMs with MCP servers

Tome Screenshot



Tome is the simplest way to get started with local LLMs and MCP. Tome manages your MCP servers so there's no fiddling with uv/npm or JSON config files: connect it to Ollama, find an MCP server via our Smithery marketplace integration (or paste your own uvx/npx command), and chat with an MCP-powered model in seconds.

This is a Technical Preview so bear in mind things will be rough around the edges. Join us on Discord to share tips, tricks, and issues you run into. Star this repo to stay on top of updates and feature releases!

Features

  • Instant connection to Ollama (local or remote) for model management
  • Chat with MCP-powered models, customize context window and temperature
  • Install MCP servers by pasting in a command (e.g., uvx mcp-server-fetch) or through the built-in Smithery marketplace, which offers thousands of servers installable with a single click
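For reference, pasted server commands take the usual uvx/npx form. The uvx line is the example from this README; the npx line is a second illustration using the filesystem server from the MCP reference servers (the directory path is a placeholder you'd replace):

```shell
# Either style of command can be pasted into Tome's MCP server field.

# A Python-based server run via uvx (the example used in this README):
uvx mcp-server-fetch

# A Node-based server run via npx (illustrative; scope it to a
# directory of your choosing — the path below is a placeholder):
npx -y @modelcontextprotocol/server-filesystem /path/to/allowed/dir
```

Tome runs and supervises these processes for you, so the commands above are configuration you paste in, not something you run in a terminal yourself.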

Getting Started

Requirements

Quickstart

  1. Install Tome and Ollama
  2. Install a tool-capable model (we're partial to Qwen3, either 14B or 8B depending on your RAM)
  3. Open the MCP tab in Tome and install your first MCP server (Fetch is an easy one to get started with; just paste uvx mcp-server-fetch into the server field).
  4. Chat with your MCP-powered model! Ask it to fetch the top story on Hacker News.
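In shell terms, the setup half of the steps above looks roughly like this. This is a sketch that assumes macOS with Homebrew for the install (you can equally download Ollama from ollama.com) and picks the 8B Qwen3 tag; adjust for your platform and RAM:

```shell
# 1. Install Ollama (one option; an installer is also available at ollama.com).
brew install ollama

# 2. Pull a tool-capable model — 8B is a reasonable fit for ~16 GB of RAM,
#    14B if you have more headroom.
ollama pull qwen3:8b

# 3. In Tome's MCP tab, paste the following into the server field:
#    uvx mcp-server-fetch
#    (Fetch lets the model retrieve web pages — e.g., the Hacker News front page.)
```

Step 3 happens inside Tome's UI rather than a terminal; Tome launches and manages the server process itself.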

Vision

We want to make local LLMs and MCP accessible to everyone. We're building a tool that allows you to be creative with LLMs, regardless of whether you're an engineer, tinkerer, hobbyist, or anyone in between.

Core Principles

  • Tome is local first: You are in control of where your data goes.
  • Tome is for everyone: You shouldn't have to manage programming languages, package managers, or JSON config files.

What's Next

  • Model support: Currently Tome uses Ollama for model management, but we'd like to expand support to other LLM engines and possibly even cloud models. Let us know if you have any requests!
  • Operating system support: We're planning on adding support for Windows, followed by Linux.
  • App builder: We believe that, long term, the best experiences will not be in a chat interface. We have plans to add additional tools that will enable you to create powerful applications and workflows.
  • ??? Let us know what you'd like to see! Join our community via the links below; we'd love to hear from you.

Community

Discord Bluesky Twitter

Tome FAQ

How does Tome simplify connecting to local LLMs?
Tome manages MCP servers automatically, removing the need for manual setup with uv/npm or JSON files and enabling instant connections to local LLM providers like Ollama.
Can I use Tome with MCP servers not listed in Smithery?
Yes, you can paste your own uvx or npx command to connect any MCP server manually.
Is Tome suitable for production environments?
Tome is currently a Technical Preview, so it may have rough edges and is best suited for experimentation and development.
What local LLM providers does Tome support?
Tome currently supports Ollama (local or remote) for model management; support for additional LLM engines, and possibly cloud models, is planned.
How can I get support or share feedback about Tome?
You can join the Tome Discord community via the provided invite link to share tips, report issues, and get help.
Does Tome require coding to manage MCP servers?
No, Tome automates MCP server management, so no coding or manual configuration is needed.
Can Tome connect to multiple MCP servers simultaneously?
Yes, Tome can manage and connect to multiple MCP servers for diverse local LLM interactions.