
MCP-on-AWS-Bedrock

MCP.Pizza Chef: davidshtian

MCP-on-AWS-Bedrock is a server implementation demonstrating how to integrate Anthropic's Model Context Protocol (MCP) with AWS Bedrock. It provides clear, practical examples and client implementations to interact with MCP-enabled tools via AWS Bedrock's runtime service. Designed for developers with Python 3.10+, it requires an AWS account with Bedrock access and configured credentials. The project includes client code for both stdio and SSE modes, facilitating real-time, structured context exchange between LLMs and external tools. This server example helps developers understand and implement MCP workflows on AWS Bedrock, supporting secure, scalable, and provider-agnostic AI integrations.

Use This MCP Server To

  • Integrate Anthropic MCP tools with the AWS Bedrock runtime
  • Develop MCP clients in Python for AWS Bedrock
  • Demonstrate stdio and SSE communication modes for MCP
  • Enable real-time context exchange with LLMs on AWS
  • Prototype MCP workflows in an AWS Bedrock environment

README

MCP on AWS Bedrock

A simple, clear example for implementing and understanding Anthropic MCP on AWS Bedrock.

For managing multiple MCP servers, see the small companion project Q-2001.

Overview

This project demonstrates how to implement and use Anthropic's Model Context Protocol (MCP) with AWS Bedrock. It provides a client implementation that can interact with MCP-enabled tools through AWS Bedrock's runtime service.
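As a rough sketch of the client side: a Bedrock Converse API response signals a tool request via `toolUse` content blocks, which an MCP client then forwards to the MCP server for execution. Assuming a response shaped like the documented Converse output, extraction might look like this (`extract_tool_uses` is a hypothetical helper, not part of this repo):

```python
# Pull pending tool calls out of an AWS Bedrock Converse API response.
# The response dict below mimics the documented Converse output shape;
# extract_tool_uses is an illustrative helper, not the project's code.

def extract_tool_uses(response: dict) -> list[dict]:
    """Return the toolUse blocks the model asked the client to run."""
    message = response.get("output", {}).get("message", {})
    return [
        block["toolUse"]
        for block in message.get("content", [])
        if "toolUse" in block
    ]

# Example Converse-style response where the model requests a tool call.
sample = {
    "stopReason": "tool_use",
    "output": {"message": {"role": "assistant", "content": [
        {"text": "Let me fetch that page."},
        {"toolUse": {"toolUseId": "t-1", "name": "fetch_url",
                     "input": {"url": "https://example.com"}}},
    ]}},
}

calls = extract_tool_uses(sample)
print(calls[0]["name"])  # fetch_url
```

The client would invoke the matching MCP tool with each block's `input` and send the result back to Bedrock as a `toolResult` message to continue the conversation.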

Updates 2025-05-10: Streamable HTTP

  • Added support for Streamable HTTP
  • Rewrote the URL-fetching MCP server fetch_url_mcp_server.py to demonstrate different transport types

Usage Instructions

Run the server with default stdio settings (no transport parameter):

uv run fetch_url_mcp_server.py

# client
uv run client_stdio.py

Run with streamable-http transport on default port (8000):

python fetch_url_mcp_server.py --transport streamable-http

# client
uv run client_streamablehttp.py

Run with streamable-http transport on custom port:

python fetch_url_mcp_server.py --transport streamable-http --port 8080
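Flag handling consistent with the commands above could be done with stdlib argparse; this is a sketch of plausible CLI parsing, and the actual fetch_url_mcp_server.py may differ:

```python
# Hypothetical CLI parsing matching the usage shown above:
# --transport selects the MCP transport, --port applies to HTTP transports.
import argparse

def parse_args(argv=None):
    parser = argparse.ArgumentParser(description="URL-fetching MCP server")
    parser.add_argument("--transport", default="stdio",
                        choices=["stdio", "sse", "streamable-http"],
                        help="MCP transport to serve (default: stdio)")
    parser.add_argument("--port", type=int, default=8000,
                        help="Port for HTTP-based transports (default: 8000)")
    return parser.parse_args(argv)

args = parse_args(["--transport", "streamable-http", "--port", "8080"])
print(args.transport, args.port)  # streamable-http 8080
```

Running with no flags falls back to stdio on the default port, matching the first usage example.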

Prerequisites

  • Python 3.10 or higher
  • AWS account with Bedrock access
  • Configured AWS credentials
  • UV package manager

Features

  • Seamless integration with the AWS Bedrock runtime using the Converse API
  • Tool format conversion for Bedrock compatibility
  • Asynchronous communication handling
  • Structured logging for debugging
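The tool-format conversion mentioned above amounts to mapping MCP tool definitions (name, description, JSON input schema) onto the `toolConfig`/`toolSpec` structure that Bedrock's Converse API expects. A minimal sketch, with `mcp_tools_to_bedrock` as an illustrative helper name:

```python
# Convert MCP-style tool definitions into the toolConfig structure
# documented for Bedrock's Converse API. mcp_tools_to_bedrock is a
# hypothetical helper; the project's converter may be shaped differently.

def mcp_tools_to_bedrock(tools: list[dict]) -> dict:
    return {
        "tools": [
            {"toolSpec": {
                "name": t["name"],
                "description": t.get("description", ""),
                "inputSchema": {"json": t["inputSchema"]},
            }}
            for t in tools
        ]
    }

mcp_tools = [{
    "name": "fetch_url",
    "description": "Fetch a web page and return its text",
    "inputSchema": {"type": "object",
                    "properties": {"url": {"type": "string"}},
                    "required": ["url"]},
}]

tool_config = mcp_tools_to_bedrock(mcp_tools)
print(tool_config["tools"][0]["toolSpec"]["name"])  # fetch_url
```

The resulting dict can be passed as the `toolConfig` argument of a Converse request, letting the model see every tool the MCP server advertises.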

Contributing

Feel free to submit issues and pull requests to improve the implementation.

License

MIT License

MCP-on-AWS-Bedrock FAQ

What programming language is used in MCP-on-AWS-Bedrock?
MCP-on-AWS-Bedrock is implemented using Python 3.10 or higher.
What AWS service does MCP-on-AWS-Bedrock utilize?
It uses AWS Bedrock to run and manage MCP-enabled models and tools.
Do I need an AWS account to use MCP-on-AWS-Bedrock?
Yes, an AWS account with Bedrock access and configured credentials is required.
What communication modes are supported by the MCP clients in this project?
The clients support stdio, server-sent events (SSE), and, since the 2025-05-10 update, Streamable HTTP transports.
Can MCP-on-AWS-Bedrock be used with LLM providers other than Anthropic?
While this example focuses on Anthropic models, AWS Bedrock hosts models from multiple providers, such as Anthropic, AI21 Labs, Cohere, Meta, and Amazon.
Is MCP-on-AWS-Bedrock suitable for production use?
It is primarily a clear example and reference implementation, ideal for learning and prototyping MCP integrations on AWS Bedrock.
How do I manage multiple MCP servers with this project?
The README suggests using the Q-2001 project for managing multiple MCP servers efficiently.