Local AI agents.
See the `docs/` directory.
See `docs/config.md` for configuration settings.

Example `~/.config/sauropod/config.toml`:
```toml
# Run the server on port 8080
port = 8080
# Point the backend to an OpenAI-compatible server like Ollama.
backend = "http://localhost:11434"

[default_model]
model = "gemma3:27b"
type = "Gemma3"

[[mcp_servers]]
# Spawn an MCP server as a subprocess controlled by the server
command = ["docker", "run", "-it", "--rm", "markitdown-mcp:latest"]

[[mcp_servers]]
# Connect to a remote MCP server
url = "http://localhost:1234"
```
Features and planned work:

- MCP tools support
- Image support
- Events
- Notifications via Web Push
- Multiple accounts
- Secrets management
- Access policies (possibly using Cedar)
- Automatically generated SDKs for workflows
Build dependencies:

- Clang
- Node.js
- Rust and Cargo
- libssl
- make
- pkg-config
Build the server with:

```sh
make release
```
The binary will be created at `target/optimized-release/sauropod-server`.
Most of the code is licensed under AGPL.
The code required to build custom clients (such as the schemas, client APIs, and OpenAPI specification) is licensed under Apache-2.0.