
starlette_mcp_sse

MCP.Pizza Chef: panz2018

starlette_mcp_sse is a Starlette-based MCP server implementation using Server-Sent Events (SSE) to provide real-time, event-driven context delivery and interaction for AI models. It demonstrates how to integrate MCP with a lightweight ASGI framework, enabling models to receive continuous updates and interact with external data sources and tools via a standardized protocol.

Use This MCP Server To

  • Stream real-time context updates to AI models using SSE
  • Implement the MCP protocol in Starlette web applications
  • Enable AI models to interact with external tools via event streams
  • Build lightweight MCP servers for AI context delivery
  • Demonstrate MCP integration with Python ASGI frameworks
  • Provide continuous data feeds to LLMs in real time

README

Starlette MCP SSE


A Server-Sent Events (SSE) implementation using Starlette framework with Model Context Protocol (MCP) integration.

What is MCP?

The Model Context Protocol (MCP) is an open standard that enables AI models to interact with external tools and data sources. MCP solves several key challenges in AI development:

  • Context limitations: Allows models to access up-to-date information beyond their training data
  • Tool integration: Provides a standardized way for models to use external tools and APIs
  • Interoperability: Creates a common interface between different AI models and tools
  • Extensibility: Makes it easy to add new capabilities to AI systems without retraining

This project demonstrates how to implement MCP using Server-Sent Events (SSE) in a Starlette web application.

Description

This project demonstrates how to implement Server-Sent Events (SSE) using the Starlette framework while integrating Model Context Protocol (MCP) functionality. The key feature is the seamless integration of MCP's SSE capabilities within a full-featured Starlette web application that includes custom routes.

Features

  • Server-Sent Events (SSE) implementation with MCP
  • Starlette framework integration with custom routes
  • Unified web application with both MCP and standard web endpoints
  • Customizable route structure
  • Clean separation of concerns between MCP and web functionality

Architecture

This project showcases a modular architecture that:

  1. Integrates MCP SSE endpoints (/sse and /messages/) into a Starlette application
  2. Provides standard web routes (/, /about, /status, /docs)
  3. Demonstrates how to maintain separation between MCP functionality and web routes

Installation & Usage Options

Prerequisites

Install the UV package manager, a fast Python package installer written in Rust.

On Windows (PowerShell):

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

On Linux/macOS:

curl -LsSf https://astral.sh/uv/install.sh | sh

Option 1: Quick Run Without Installation

Run the application directly without cloning the repository using UV's execution tool:

uvx --from git+https://github.com/panz2018/starlette_mcp_sse.git start

Option 2: Full Installation

Create Virtual Environment

Create an isolated Python environment for the project:

uv venv

Activate Virtual Environment

Activate the virtual environment.

On Windows:

.venv\Scripts\activate

On Linux/macOS:

source .venv/bin/activate

Install Dependencies

Install all required packages:

uv pip install -r pyproject.toml

Start the Integrated Server

Launch the integrated Starlette server with MCP SSE functionality:

python src/server.py

or

uv run start

Available Endpoints

After starting the server (using either Option 1 or Option 2), the following endpoints are available:

  • Standard web routes: /, /about, /status, /docs
  • MCP SSE endpoints: /sse (SSE connection) and /messages/ (message posting)

Debug with MCP Inspector

For testing and debugging MCP functionality, use the MCP Inspector:

mcp dev ./src/weather.py

Connect to MCP Inspector

  1. Open MCP Inspector at http://localhost:5173
  2. Configure the connection:
    • Transport type: SSE
    • URL: http://localhost:8000/sse

Test the Functions

  1. Navigate to Tools section
  2. Click List Tools to see available functions:
    • get_alerts : Get weather alerts
    • get_forecast : Get weather forecast
  3. Select a function
  4. Enter required parameters
  5. Click Run Tool to execute

Extending the Application

Adding Custom Routes

The application structure makes it easy to add new routes:

  1. Define new route handlers in routes.py
  2. Add routes to the routes list in routes.py
  3. The main application will automatically include these routes

Customizing MCP Integration

The MCP SSE functionality is integrated in server.py through:

  • Creating an SSE transport
  • Setting up an SSE handler
  • Adding MCP routes to the Starlette application
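On the wire, SSE is a simple line-oriented text protocol carried over a long-lived HTTP response. As a stdlib-only illustration of the event framing (not the SDK's actual transport code):

```python
from typing import Optional

def format_sse_event(data: str, event: Optional[str] = None) -> str:
    """Frame a payload as a Server-Sent Event.

    An SSE event is one or more "field: value" lines terminated by a
    blank line; the client parses the stream incrementally.
    """
    lines = []
    if event is not None:
        lines.append(f"event: {event}")
    # multi-line payloads become multiple data: lines
    for chunk in data.splitlines() or [""]:
        lines.append(f"data: {chunk}")
    return "\n".join(lines) + "\n\n"

print(format_sse_event("hello", event="message"))
# event: message
# data: hello
```

The transport created in server.py streams frames like these over the /sse endpoint, while clients post their messages back via /messages/.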

Integration with Continue

To use this MCP server with the Continue VS Code extension, add the following configuration to your Continue settings:

{
  "experimental": {
    "modelContextProtocolServers": [
      {
        "transport": {
          "name": "weather",
          "type": "sse",
          "url": "http://localhost:8000/sse"
        }
      }
    ]
  }
}

starlette_mcp_sse FAQ

How does starlette_mcp_sse deliver context updates to models?
It uses Server-Sent Events (SSE) to push real-time context updates from the server to connected AI models.

What is the advantage of using SSE in this MCP server?
SSE provides a lightweight, efficient, and persistent connection for streaming data, ideal for real-time MCP context delivery.

Can starlette_mcp_sse be integrated with other Python web frameworks?
While designed for Starlette, the SSE MCP approach can be adapted to other ASGI-compatible frameworks with some modifications.

Does starlette_mcp_sse support multiple concurrent model connections?
Yes, it supports multiple clients connecting simultaneously to receive MCP context streams.

Is starlette_mcp_sse suitable for production use?
It is a working example intended for demonstration and prototyping; production use may require additional robustness and security enhancements.

How does starlette_mcp_sse handle MCP protocol compliance?
It implements the MCP specification for context streaming over SSE, ensuring interoperability with compliant clients and tools.

What programming language is starlette_mcp_sse written in?
It is implemented in Python using the Starlette ASGI framework.

Can starlette_mcp_sse be used with different LLM providers?
Yes, it is provider-agnostic and can work with models from OpenAI, Anthropic Claude, Google Gemini, and others.