MCP-on-AWS-Bedrock

MCP.Pizza Chef: davidshtian

MCP-on-AWS-Bedrock is a server implementation demonstrating how to integrate Anthropic's Model Context Protocol (MCP) with AWS Bedrock. It provides clear, practical examples and client implementations for interacting with MCP-enabled tools via AWS Bedrock's runtime service. It requires Python 3.10+, an AWS account with Bedrock access, and configured AWS credentials. The project includes client code for both stdio and SSE modes, facilitating real-time, structured context exchange between LLMs and external tools. This server example helps developers understand and implement MCP workflows on AWS Bedrock, supporting secure, scalable, and provider-agnostic AI integrations.

Use This MCP Server To

  • Integrate Anthropic MCP tools with the AWS Bedrock runtime
  • Develop MCP clients in Python for AWS Bedrock
  • Demonstrate stdio and SSE communication modes for MCP
  • Enable real-time context exchange with LLMs on AWS
  • Prototype MCP workflows in an AWS Bedrock environment

README

MCP on AWS Bedrock

A simple and clear example for implementing and understanding Anthropic MCP (on AWS Bedrock).

For managing multiple MCP servers, the tiny project Q-2001 may be a useful reference.

Overview

This project demonstrates how to implement and use Anthropic's Model Context Protocol (MCP) with AWS Bedrock. It provides a client implementation that can interact with MCP-enabled tools through AWS Bedrock's runtime service.

Prerequisites

  • Python 3.10 or higher
  • AWS account with Bedrock access
  • Configured AWS credentials
  • UV package manager
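
A quick way to confirm that your AWS credentials and Bedrock access are in place is to list the available foundation models with boto3. This is a minimal sketch; the region name is an assumption and should match your own setup:

import boto3

# Assumes credentials are already configured (environment variables,
# ~/.aws/credentials, or an IAM role) and that Bedrock is enabled in the region.
bedrock = boto3.client("bedrock", region_name="us-east-1")

for model in bedrock.list_foundation_models()["modelSummaries"]:
    print(model["modelId"])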

Project Structure

  • client_stdio.py: Main client implementation for interacting with Bedrock and MCP tools using stdio mode
  • client_sse.py: Main client implementation for interacting with Bedrock and MCP tools using SSE mode
  • mcp_simple_tool/: Directory containing the MCP tool implementation
    • server.py: MCP tool server implementation
    • __main__.py: Entry point for the tool
  • pyproject.toml: Project dependencies and configuration
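
The tool server itself is not reproduced here, but a minimal MCP tool server along the lines of mcp_simple_tool/server.py could look like the sketch below, assuming the official MCP Python SDK's FastMCP helper. The tool name and behavior are illustrative only; the project's actual server may be implemented differently:

from mcp.server.fastmcp import FastMCP

# Illustrative tool server; the real mcp_simple_tool may expose different tools.
mcp = FastMCP("mcp-simple-tool")

@mcp.tool()
def echo(text: str) -> str:
    """Return the given text unchanged."""
    return text

if __name__ == "__main__":
    # Runs over stdio by default; pass transport="sse" to serve over HTTP/SSE instead.
    mcp.run()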

Usage

Run the stdio client with:

uv pip install boto3
uv run client_stdio.py

The client will:

  1. Initialize a connection to AWS Bedrock
  2. Start the MCP tool server
  3. List available tools and convert them to the format required by Bedrock
  4. Handle communication between Bedrock and the MCP tools
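
In code, that flow could look roughly like the sketch below, assuming the MCP Python SDK's stdio client and the Bedrock Converse API. The model ID, prompt, region, and server launch command are placeholders, not the project's exact values:

import asyncio
import boto3
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

MODEL_ID = "anthropic.claude-3-5-sonnet-20240620-v1:0"  # placeholder model ID

async def main():
    bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

    # 1-2. Start the MCP tool server as a subprocess and open a stdio session.
    server = StdioServerParameters(command="uv", args=["run", "mcp-simple-tool"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 3. List MCP tools and convert them to Bedrock's toolSpec format.
            mcp_tools = await session.list_tools()
            tool_config = {
                "tools": [
                    {
                        "toolSpec": {
                            "name": t.name,
                            "description": t.description or t.name,
                            "inputSchema": {"json": t.inputSchema},
                        }
                    }
                    for t in mcp_tools.tools
                ]
            }

            # 4. Let Bedrock decide when to call a tool, then relay the call to MCP.
            messages = [{"role": "user", "content": [{"text": "Say hello via the tool."}]}]
            response = bedrock.converse(modelId=MODEL_ID, messages=messages, toolConfig=tool_config)

            for block in response["output"]["message"]["content"]:
                if "toolUse" in block:
                    tool_use = block["toolUse"]
                    result = await session.call_tool(tool_use["name"], tool_use["input"])
                    print(result.content)
                elif "text" in block:
                    print(block["text"])

asyncio.run(main())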

Run the SSE client with:

# server
uv pip install boto3 uvicorn
uv run mcp-simple-tool --transport sse --port 8000

# client
uv run client_sse.py
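
A hedged sketch of the SSE side of the client, assuming the MCP SDK's sse_client helper and the /sse endpoint exposed by the server started above (the URL is an assumption tied to --port 8000):

import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Connect to the SSE endpoint of `uv run mcp-simple-tool --transport sse --port 8000`.
    # The /sse path is the SDK default; adjust if the server is configured differently.
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])
            # From here the flow mirrors the stdio client: convert the tools to
            # Bedrock's toolSpec format and pass them to the Converse API.

asyncio.run(main())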

Features

  • Seamless integration with the AWS Bedrock runtime using the Converse API
  • Tool format conversion for Bedrock compatibility
  • Asynchronous communication handling
  • Structured logging for debugging

Contributing

Feel free to submit issues and pull requests to improve the implementation.

License

MIT License

MCP-on-AWS-Bedrock FAQ

What programming language is used in MCP-on-AWS-Bedrock?
MCP-on-AWS-Bedrock is implemented using Python 3.10 or higher.
What AWS service does MCP-on-AWS-Bedrock utilize?
It uses AWS Bedrock to run and manage MCP-enabled models and tools.
Do I need an AWS account to use MCP-on-AWS-Bedrock?
Yes, an AWS account with Bedrock access and configured credentials is required.
What communication modes are supported by the MCP clients in this project?
The project supports stdio and server-sent events (SSE) modes for client-server communication.
Can MCP-on-AWS-Bedrock be used with LLM providers other than Anthropic?
While this example focuses on Anthropic MCP, AWS Bedrock itself hosts models from multiple providers, such as Anthropic, Amazon, Cohere, Meta, Mistral AI, and AI21 Labs.
Is MCP-on-AWS-Bedrock suitable for production use?
It is primarily a clear example and reference implementation, ideal for learning and prototyping MCP integrations on AWS Bedrock.
How do I manage multiple MCP servers with this project?
The README suggests using the Q-2001 project for managing multiple MCP servers efficiently.