
Letta-MCP-server

MCP.Pizza Chef: oculairmedia

Letta-MCP-server is a dedicated MCP server that manages agent operations, memory handling, and seamless integration with the Letta system. It supports real-time communication with agents and provides modular tools for extending functionality. Designed for easy deployment via Node.js or Docker, it enables developers to build robust AI workflows with Letta's ecosystem.

Use This MCP Server To

  • Manage Letta agents and their lifecycle
  • Perform memory operations for agent context
  • Integrate Letta system features into AI workflows
  • Deploy the MCP server with Node.js or Docker
  • Enable real-time communication with AI agents
  • Extend functionality via modular tool implementations

README


Letta MCP Server

A server that provides tools for agent management, memory operations, and integration with the Letta system.

Quick Setup

Option 1: Run with Node.js

# Development (with hot reload)
npm run dev:sse     # SSE transport

# Production
npm run build       # Build TypeScript first
npm run start:sse   # SSE transport

Option 2: Run with Docker

# Build and run locally
docker build -t letta-mcp-server .
docker run -d -p 3001:3001 -e PORT=3001 -e NODE_ENV=production --name letta-mcp letta-mcp-server

# Or use the public image
docker run -d -p 3001:3001 -e PORT=3001 -e NODE_ENV=production --name letta-mcp ghcr.io/oculairmedia/letta-mcp-server:latest
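
The same flags can be captured in a compose file. A minimal sketch mirroring the run command above (the `restart` policy is an added convenience, not required):

```yaml
# docker-compose.yml -- equivalent to the docker run command above
services:
  letta-mcp:
    image: ghcr.io/oculairmedia/letta-mcp-server:latest
    container_name: letta-mcp
    ports:
      - "3001:3001"
    environment:
      PORT: "3001"
      NODE_ENV: production
    restart: unless-stopped
```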

Directory Structure

  • index.js - Main entry point
  • core/ - Core server functionality
  • tools/ - Individual tool implementations
  • transports/ - Server transport implementations (stdio and SSE)
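
To illustrate the modular layout, a new tool under tools/ can be a self-contained module that exports a definition and a handler. The interface below is a hypothetical sketch, not the server's actual contract; consult the existing modules in tools/ for the real shape.

```typescript
// Hypothetical tool-module shape -- illustrative only; the server's
// real interface in tools/ may differ.
export interface ToolDefinition {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>;
  handler: (args: Record<string, unknown>) => Promise<unknown>;
}

// Example tool: echoes its input back, showing how a custom tool
// could slot into the tools/ directory.
export const echoTool: ToolDefinition = {
  name: "echo",
  description: "Return the supplied message unchanged",
  inputSchema: {
    type: "object",
    properties: { message: { type: "string" } },
    required: ["message"],
  },
  handler: async (args) => ({ content: String(args.message) }),
};
```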

Available Tools

Agent Management

| Tool | Description | Required Parameters | Optional Parameters |
| --- | --- | --- | --- |
| create_agent | Create a new Letta agent | name, description | model, embedding |
| list_agents | List all available agents | - | filter |
| prompt_agent | Send a message to an agent | agent_id, message | - |
| get_agent | Get agent details by ID | agent_id | - |
| modify_agent | Update an existing agent | agent_id, update_data | - |
| delete_agent | Delete an agent | agent_id | - |
| clone_agent | Clone an existing agent | source_agent_id, new_agent_name | override_existing_tools, project_id |
| bulk_delete_agents | Delete multiple agents | - | agent_ids, agent_name_filter, agent_tag_filter |
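
Over MCP, each of these tools is invoked with a standard `tools/call` request. For example, a client might send the following JSON-RPC message to prompt an agent (the agent ID and message are placeholders):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "prompt_agent",
    "arguments": {
      "agent_id": "agent-00000000",
      "message": "Summarize today's tasks"
    }
  }
}
```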

Memory Management

| Tool | Description | Required Parameters | Optional Parameters |
| --- | --- | --- | --- |
| list_memory_blocks | List all memory blocks | - | filter, agent_id, page, pageSize, label |
| create_memory_block | Create a new memory block | name, label, value | agent_id, metadata |
| read_memory_block | Read a memory block | block_id | agent_id |
| update_memory_block | Update a memory block | block_id | value, metadata, agent_id |
| attach_memory_block | Attach memory to an agent | block_id, agent_id | label |

Tool Management

| Tool | Description | Required Parameters | Optional Parameters |
| --- | --- | --- | --- |
| list_tools | List all available tools | - | filter, page, pageSize |
| list_agent_tools | List tools for a specific agent | agent_id | - |
| attach_tool | Attach tools to an agent | agent_id | tool_id, tool_ids, tool_names |
| upload_tool | Upload a new tool | name, description, source_code | category, agent_id |
| bulk_attach_tool_to_agents | Attach a tool to multiple agents | tool_id | agent_name_filter, agent_tag_filter |

Additional Tools

  • Model Management: list_llm_models, list_embedding_models
  • Archive Management: list_passages, create_passage, modify_passage, delete_passage
  • MCP Server Management: list_mcp_servers, list_mcp_tools_by_server
  • Import/Export: export_agent, import_agent

Docker Operations

# View container logs
docker logs -f letta-mcp

# Stop the container
docker stop letta-mcp

# Update to latest version
docker pull ghcr.io/oculairmedia/letta-mcp-server:latest
docker stop letta-mcp
docker rm letta-mcp
docker run -d -p 3001:3001 -e PORT=3001 -e NODE_ENV=production --name letta-mcp ghcr.io/oculairmedia/letta-mcp-server:latest

Configuration with MCP Settings

Add the server to your mcp_settings.json:

"letta": {
  "command": "node",
  "args": [
    "--no-warnings",
    "--experimental-modules",
    "path/to/letta-server/index.js"
  ],
  "env": {
    "LETTA_BASE_URL": "https://your-letta-instance.com",
    "LETTA_PASSWORD": "yourPassword"
  },
  "disabled": false,
  "alwaysAllow": [
    "upload_tool",
    "attach_tool",
    "list_agents",
    "list_memory_blocks"
  ],
  "timeout": 300
}

For remote instances, use the URL configuration:

"remote_letta_tools": {
  "url": "http://your-server:3001/sse",
  "disabled": false,
  "alwaysAllow": [
    "attach_tool", 
    "list_agents",
    "list_tools",
    "get_agent"
  ],
  "timeout": 120
}
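
Both fragments above are entries in the client's MCP server map. Assuming a client whose settings file uses a top-level `mcpServers` key (the exact key varies by client), a complete file might look like:

```json
{
  "mcpServers": {
    "letta": {
      "command": "node",
      "args": [
        "--no-warnings",
        "--experimental-modules",
        "path/to/letta-server/index.js"
      ],
      "env": {
        "LETTA_BASE_URL": "https://your-letta-instance.com",
        "LETTA_PASSWORD": "yourPassword"
      },
      "disabled": false,
      "alwaysAllow": ["list_agents", "list_memory_blocks"],
      "timeout": 300
    }
  }
}
```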

Letta-MCP-server FAQ

How do I deploy the Letta-MCP-server?
You can deploy it using Node.js with npm scripts or via Docker containers for easy setup.
What functionalities does the Letta-MCP-server provide?
It manages agents, handles memory operations, and integrates with the Letta system for AI workflows.
Can I extend the Letta-MCP-server with custom tools?
Yes, the server supports modular tool implementations located in the 'tools/' directory.
How does the server communicate with agents?
It supports both stdio and SSE (Server-Sent Events) transports; the SSE transport enables real-time communication with agents over HTTP.
Is the Letta-MCP-server production-ready?
Yes, it supports production deployment with build and start scripts and Docker images.
Where can I find the main entry point of the server?
The main entry point is the 'index.js' file in the root directory.
Does the server support hot reload during development?
Yes, running 'npm run dev:sse' enables development mode with hot reload.
What environment variables are required?
Common variables include PORT and NODE_ENV to configure the server port and environment; LETTA_BASE_URL and LETTA_PASSWORD configure the connection to your Letta instance.
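
The SSE transport mentioned above delivers messages as newline-delimited fields, with each payload carried in `data:` lines. A minimal illustrative parser for those lines (a real client should use an SSE library or the EventSource API rather than hand-rolling this):

```typescript
// Extract the payloads of "data:" fields from a chunk of an SSE
// stream. Illustrative sketch only -- ignores event/id/retry fields
// and multi-line data coalescing.
export function parseSseData(chunk: string): string[] {
  return chunk
    .split(/\r?\n/)
    .filter((line) => line.startsWith("data:"))
    .map((line) => line.slice("data:".length).trim());
}
```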