scira-mcp-chat

MCP.Pizza Chef: zaidmukaddam

Scira MCP Chat is an open-source, minimalistic MCP client built with Next.js and Vercel's AI SDK. It supports streaming text responses and seamless integration with multiple AI providers and MCP servers. Featuring multiple transport types and built-in tool integration, it enables flexible, real-time AI chat experiences with reasoning model support and a modern UI.

Use This MCP Client To

  • Stream AI chat responses from multiple providers
  • Connect to various MCP servers via SSE or stdio
  • Integrate and extend AI capabilities with built-in tools
  • Build AI chatbots with reasoning model support
  • Switch AI providers with minimal code changes
  • Create real-time interactive AI chat applications

README

An open-source AI chatbot app powered by Model Context Protocol (MCP), built with Next.js and the AI SDK by Vercel.

Features · MCP Configuration · License


Features

  • Streaming text responses powered by the AI SDK by Vercel, allowing multiple AI providers to be used interchangeably with just a few lines of code (see the sketch after this list).
  • Full integration with Model Context Protocol (MCP) servers to expand available tools and capabilities.
  • Multiple MCP transport types (SSE and stdio) for connecting to various tool providers.
  • Built-in tool integration for extending AI capabilities.
  • Reasoning model support.
  • shadcn/ui components for a modern, responsive UI powered by Tailwind CSS.
  • Built with the latest Next.js App Router.
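
As an illustration of the provider-agnostic streaming mentioned above (not code taken from this repository), a minimal AI SDK call might look like the sketch below; the model name, provider package, and exact signature depend on the AI SDK version you install:

```ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Stream a completion; switching providers only changes the `model` line,
// e.g. anthropic('claude-3-5-sonnet-latest') from '@ai-sdk/anthropic'.
const result = streamText({
  model: openai('gpt-4o'),
  messages: [{ role: 'user', content: 'Hello!' }],
});

// Consume tokens as they arrive.
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
```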

MCP Server Configuration

This application supports connecting to Model Context Protocol (MCP) servers to access their tools. You can add and manage MCP servers through the settings icon in the chat interface.

Adding an MCP Server

  1. Click the settings icon (⚙️) next to the model selector in the chat interface.
  2. Enter a name for your MCP server.
  3. Select the transport type:
    • SSE (Server-Sent Events): For HTTP-based remote servers
    • stdio (Standard I/O): For local servers running on the same machine

SSE Configuration

If you select SSE transport:

  1. Enter the server URL (e.g., https://mcp.example.com/token/sse)
  2. Click "Add Server"
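
A server added this way is, conceptually, an MCP client connection over SSE. As a minimal sketch (not the app's actual code), the AI SDK's experimental MCP client can open the same kind of connection; the `experimental_createMCPClient` API is experimental and may change, and the URL below is a placeholder:

```ts
import { experimental_createMCPClient as createMCPClient } from 'ai';

// Connect to a remote MCP server over SSE.
const mcpClient = await createMCPClient({
  transport: {
    type: 'sse',
    url: 'https://mcp.example.com/token/sse',
  },
});

// Fetch the server's tools so they can be passed to streamText/generateText.
const tools = await mcpClient.tools();
```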

stdio Configuration

If you select stdio transport:

  1. Enter the command to execute (e.g., npx)
  2. Enter the command arguments (e.g., -y @modelcontextprotocol/server-google-maps)
    • You can enter space-separated arguments or paste a JSON array
  3. Click "Add Server"
  4. Click "Use" to activate the server for the current chat session.
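
Conceptually, a stdio server configured this way is launched as a child process and spoken to over standard I/O. A minimal sketch (not the app's actual code) using the AI SDK's experimental stdio transport, reusing the example command and arguments from the steps above:

```ts
import { experimental_createMCPClient as createMCPClient } from 'ai';
import { Experimental_StdioMCPTransport as StdioMCPTransport } from 'ai/mcp-stdio';

// Spawn the local MCP server and communicate with it over stdio.
const mcpClient = await createMCPClient({
  transport: new StdioMCPTransport({
    command: 'npx',
    args: ['-y', '@modelcontextprotocol/server-google-maps'],
  }),
});

// The returned tools can then be wired into the chat model.
const tools = await mcpClient.tools();
```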

Available MCP Servers

You can use any MCP-compatible server with this application. Here are some examples:

  • Composio - Provides search, code interpreter, and other tools
  • Zapier MCP - Provides access to Zapier tools
  • Any self-hosted MCP server that runs over stdio, e.g. launched with npx or python3

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

scira-mcp-chat FAQ

How does Scira MCP Chat support multiple AI providers?
It uses the AI SDK by Vercel, which lets providers such as OpenAI, Anthropic (Claude), and Google (Gemini) be swapped with minimal code changes.
What MCP transport types does Scira MCP Chat support?
It supports Server-Sent Events (SSE) and stdio transports for flexible server connections.
Can I extend Scira MCP Chat with custom tools?
Yes, it has built-in tool integration to add and manage custom AI capabilities.
What frameworks and technologies is Scira MCP Chat built on?
It is built with Next.js and leverages the AI SDK by Vercel for AI interactions.
Does Scira MCP Chat support reasoning models?
Yes, it includes support for reasoning models to enhance AI response quality.
Is Scira MCP Chat open source?
Yes, it is an open-source project available on GitHub for community use and contributions.