mcp-client-chatbot


MCP Client Chatbot is an open source, versatile AI chat interface client supporting multiple providers like OpenAI, Anthropic, Google, and Ollama. It enables instant execution in fully local environments without complex setup, giving users full control over computing resources. Built with Vercel AI SDK and Next.js, it leverages the Model Context Protocol to integrate external tools seamlessly into chat workflows, making it ideal for developers and users seeking a flexible, provider-agnostic AI chat client.

Use This MCP Client To

  • Run AI chatbots locally without cloud dependencies
  • Switch between multiple AI providers seamlessly
  • Integrate external tools into chat workflows
  • Deploy AI chat interfaces with minimal configuration
  • Control computing resources on personal servers
  • Experiment with multi-provider AI chat solutions
  • Build custom AI chat applications using MCP
  • Test AI models in isolated local environments


MCP Client Chatbot

English | 한국어

Local First MCP Supported

MCP Client Chatbot is a versatile chat interface that supports various AI model providers like OpenAI, Anthropic, Google, and Ollama. It is designed for instant execution in 100% local environments without complex configuration, enabling users to fully control computing resources on their personal computer or server.

Built with Vercel AI SDK and Next.js, this app adopts modern patterns for building AI chat interfaces. Leverage the power of Model Context Protocol (MCP) to seamlessly integrate external tools into your chat experience.

🌟 Open Source Project MCP Client Chatbot is a 100% community-driven open source project.

Demo

Here are some quick examples of how you can use MCP Client Chatbot:


🧩 Browser Automation with Playwright MCP

playwright-demo

Example: Control a web browser using Microsoft's playwright-mcp tool.

Sample prompt:

Please go to GitHub and visit the cgoinglove profile.
Open the mcp-client-chatbot project.
Then, click on the README.md file.
After that, close the browser.
Finally, tell me how to install the package.

⚡️ Quick Tool Mentions (@)

mention

Quickly call any registered MCP tool during chat by typing @toolname.
No need to memorize — just type @ and pick from the list!

You can also control how tools are used with the new Tool Choice Mode:

  • Auto: Tools are automatically called by the model when needed.
  • Manual: The model will ask for your permission before calling any tool.
  • None: Disables all tool usage.

Toggle modes anytime with the shortcut ⌘P.
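These three modes map naturally onto the Vercel AI SDK's `toolChoice` option. The sketch below is illustrative only: the `ToolChoiceMode` type and `resolveToolChoice` function are hypothetical names, not taken from the actual codebase.

```typescript
// Hypothetical mapping of the UI's Tool Choice Mode onto the Vercel AI SDK's
// `toolChoice` setting ("auto" | "none" | ...).
type ToolChoiceMode = "auto" | "manual" | "none";

function resolveToolChoice(mode: ToolChoiceMode): "auto" | "none" {
  // "manual" still lets the model propose a tool call ("auto" at the SDK
  // level); the app layer then intercepts the call and asks the user for
  // permission before actually executing the tool.
  return mode === "none" ? "none" : "auto";
}

console.log(resolveToolChoice("manual")); // "auto"
```

The key design point is that Manual mode is enforced in the application layer, not the model layer: the model is always free to request a tool, but execution waits for user confirmation.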


🔌 Adding MCP Servers Easily

mcp-server-install

Add new MCP servers easily through the UI, and start using new tools without restarting the app.


🛠️ Standalone Tool Testing

tool-test

Test MCP tools independently from chat sessions for easier development and debugging.

📊 Built-in Chart Tools

chart-tools-demo

Visualize chatbot responses as pie, bar, or line charts using the built-in tool — perfect for quick data insight during conversations.


✨ Key Features

  • 💻 100% Local Execution: Run directly on your PC or server without complex deployment, fully utilizing and controlling your computing resources.
  • 🤖 Multiple AI Model Support: Flexibly switch between providers like OpenAI, Anthropic, Google AI, and Ollama.
  • 🛠️ Powerful MCP Integration: Seamlessly connect external tools (browser automation, database operations, etc.) into chat via Model Context Protocol.
  • 🚀 Standalone Tool Tester: Test and debug MCP tools separately from the main chat interface.
  • 💬 Intuitive Mentions + Tool Control: Trigger tools with @, and control when they're used via Auto / Manual / None modes.
  • ⚙️ Easy Server Setup: Configure MCP connections via UI or .mcp-config.json file.
  • 📄 Markdown UI: Communicate in a clean, readable markdown-based interface.
  • 💾 Zero-Setup Local DB: Uses SQLite by default for local storage (PostgreSQL also supported).
  • 🧩 Custom MCP Server Support: Modify the built-in MCP server logic or create your own.
  • 📊 Built-in Chart Tools: Generate pie, bar, and line charts directly in chat with natural prompts.

🚀 Getting Started

This project uses pnpm as the recommended package manager.

# 1. Install dependencies
pnpm i

# 2. Initialize project (creates .env, sets up DB)
pnpm initial

# 3. Start dev server
pnpm dev

Open http://localhost:3000 in your browser to get started.

⚠️Note for Local Production Testing: When testing production builds locally with pnpm build and pnpm start (without a custom domain), you may need to configure NextAuth with trustHost: true and adjust cookie security settings to prevent authentication issues. See issue #30 for the specific changes.


Environment Variables

The pnpm initial command generates a .env file. Add your API keys there:

GOOGLE_GENERATIVE_AI_API_KEY=****
OPENAI_API_KEY=****
# ANTHROPIC_API_KEY=****

SQLite is the default DB (db.sqlite). To use PostgreSQL, set USE_FILE_SYSTEM_DB=false and define POSTGRES_URL in .env.
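For example, a `.env` configured for PostgreSQL might contain the following (the connection string is a placeholder; substitute your own credentials):

```
USE_FILE_SYSTEM_DB=false
POSTGRES_URL=postgres://user:password@localhost:5432/chatbot
```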


MCP Server Setup

You can connect MCP tools via:

  1. UI Setup: Go to http://localhost:3000/mcp and configure through the interface.
  2. Direct File Edit: Modify .mcp-config.json in project root.
  3. Custom Logic: Edit ./custom-mcp-server/index.ts to implement your own logic.
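As a rough sketch, an entry in `.mcp-config.json` typically names a server and the command used to launch it, following the common MCP client convention. The exact schema for this project may differ, and the `playwright` entry below is only an example:

```json
{
  "playwright": {
    "command": "npx",
    "args": ["@playwright/mcp@latest"]
  }
}
```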

💡 Tips & Guides

Here are some practical tips and guides for using MCP Client Chatbot:


🗺️ Roadmap: Next Features

MCP Client Chatbot is evolving with these upcoming features:

🚀 Deployment & Hosting

  • Self Hosting:
    • Easy deployment with Docker Compose
    • Vercel deployment support (MCP Server: SSE only)

🗣️ Audio & Real-Time Chat

  • Open Audio Real-Time Chat:
    • Real-time voice chat with MCP Server integration

📎 File & Image

  • File Attach & Image Generation:
    • File upload and image generation
    • Multimodal conversation support

🔄 MCP Workflow

  • MCP Flow:
    • Workflow automation with MCP Server integration

🛠️ Built-in Tools & UX

  • Default Tools for Chatbot:
    • Collaborative document editing (like OpenAI Canvas: user & assistant co-editing)
    • RAG (Retrieval-Augmented Generation)
    • Useful built-in tools for chatbot UX (usable without MCP)

💻 LLM Code Write (with Daytona)

  • LLM-powered code writing, editing, and execution in a cloud development environment via Daytona integration: instantly generate, modify, and run code with AI assistance, with no local setup required.

💡 If you have suggestions or need specific features, please create an issue!


🙌 Contributing

We welcome all contributions! Bug reports, feature ideas, code improvements — everything helps us build the best local AI assistant.

Let’s build it together 🚀

mcp-client-chatbot FAQ

How do I install the MCP Client Chatbot locally?
Clone the GitHub repo, install dependencies with pnpm i, initialize the project with pnpm initial, and start the Next.js app with pnpm dev.
Can I use multiple AI providers simultaneously?
You can configure multiple providers (OpenAI, Anthropic, Google, Ollama) side by side and switch between them freely within the interface.
Does the client require internet access to run?
It supports 100% local execution, so internet is not required if models and tools are hosted locally.
How does MCP Client Chatbot integrate external tools?
It uses the Model Context Protocol to connect and orchestrate external MCP servers and tools within chat sessions.
Is the client customizable for different workflows?
Yes, built on Next.js and Vercel AI SDK, it is highly customizable for various AI chat workflows.
What programming languages and frameworks does it use?
It is built with JavaScript/TypeScript using Next.js and the Vercel AI SDK.
Can I deploy this client on a server?
Yes, it can be deployed on personal or cloud servers for remote access.
How secure is running the client locally?
Running locally ensures data privacy and control over computing resources without cloud exposure.