
mcp-server-trino

MCP.Pizza Chef: Dataring-engineering

mcp-server-trino is an MCP server that integrates Trino, a distributed SQL query engine, with the Model Context Protocol (MCP). It exposes Trino tables as MCP resources, reads table contents, and executes arbitrary SQL queries through a Python interface. This gives AI models real-time, structured access to big data analytics environments via Trino's Python client.

Use This MCP Server To

  • List available Trino tables as MCP resources
  • Execute arbitrary SQL queries on Trino clusters
  • Read and stream table contents from Trino
  • Integrate Trino data querying into AI workflows
  • Enable real-time data access for LLMs via Trino
  • Bridge big data analytics with AI model context
  • Automate data retrieval from Trino for reports
  • Support multi-step reasoning with live Trino data

README

Trino MCP Server

This repository provides an MCP (Model Context Protocol) server that lets you list and query Trino tables from Python.

Overview

  • MCP: The Model Context Protocol (MCP) bridges AI models, data, and tools. This example MCP server provides:
    • A list of Trino tables as MCP resources
    • The ability to read table contents through MCP
    • A tool for executing arbitrary SQL queries against Trino
  • Trino: A fast, distributed SQL query engine for big data analytics. This server uses Trino’s Python client (trino.dbapi) to connect to a Trino host, catalog, and schema.
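As a rough sketch of what the server does under the hood, connecting through `trino.dbapi` and listing tables looks like this. The function name and defaults here are illustrative only, not the server's actual API; it requires `pip install trino` and a reachable Trino server.

```python
def list_tables(host="localhost", port=8080, user="trino",
                catalog="tpch", schema="tiny"):
    """List tables via Trino's DB-API client (illustrative sketch)."""
    # Imported lazily so the sketch is importable without the driver installed.
    from trino.dbapi import connect

    conn = connect(host=host, port=port, user=user,
                   catalog=catalog, schema=schema)
    cur = conn.cursor()
    cur.execute("SHOW TABLES")          # each row is a one-element tuple
    return [row[0] for row in cur.fetchall()]

if __name__ == "__main__":
    print(list_tables())
```

Reading a table or running an arbitrary query works the same way: pass any SQL string to `cur.execute(...)` and fetch the rows.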

Requirements

  • Python 3.9+ (asyncio ships with the standard library)
  • trino (the Python driver for Trino)
  • mcp (the Model Context Protocol Python library)

Configuration

The server reads Trino connection details from environment variables:

| Variable | Description | Default |
| --- | --- | --- |
| `TRINO_HOST` | Trino server hostname or IP | `localhost` |
| `TRINO_PORT` | Trino server port | `8080` |
| `TRINO_USER` | Trino user name | (required) |
| `TRINO_PASSWORD` | Trino password (optional, depends on your authentication setup) | (empty) |
| `TRINO_CATALOG` | Default catalog to use (e.g., `hive`, `tpch`, `postgresql`) | (required) |
| `TRINO_SCHEMA` | Default schema to use (e.g., `default`, `public`) | (required) |
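A minimal sketch of how these variables could be assembled into connection settings. The helper name `trino_config` is hypothetical; the defaults mirror the table above.

```python
import os

def trino_config():
    """Read Trino connection settings from the environment.

    TRINO_USER, TRINO_CATALOG, and TRINO_SCHEMA are required and raise
    KeyError if unset; the others fall back to their documented defaults.
    """
    return {
        "host": os.environ.get("TRINO_HOST", "localhost"),
        "port": int(os.environ.get("TRINO_PORT", "8080")),
        "user": os.environ["TRINO_USER"],                   # required
        "password": os.environ.get("TRINO_PASSWORD", ""),   # optional
        "catalog": os.environ["TRINO_CATALOG"],             # required
        "schema": os.environ["TRINO_SCHEMA"],               # required
    }
```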

Usage

Add a server entry to your MCP client configuration, for example:

```json
{
  "mcpServers": {
    "trino": {
      "command": "uv",
      "args": [
        "--directory",
        "<path_to_mcp_server_trino>",
        "run",
        "mcp_server_trino"
      ],
      "env": {
        "TRINO_HOST": "<host>",
        "TRINO_PORT": "<port>",
        "TRINO_USER": "<user>",
        "TRINO_PASSWORD": "<password>",
        "TRINO_CATALOG": "<catalog>",
        "TRINO_SCHEMA": "<schema>"
      }
    }
  }
}
```

mcp-server-trino FAQ

How do I configure the Trino connection for this MCP server?
Configure Trino connection details via environment variables such as host, port, catalog, and schema before starting the server.
What Python versions are supported by mcp-server-trino?
It supports Python 3.9 and above, compatible with the mcp, trino, and asyncio libraries.
Can I execute custom SQL queries through this MCP server?
Yes, it provides a tool to run arbitrary SQL queries against the connected Trino instance.
Does this server support asynchronous query execution?
Yes, it leverages Python's asyncio for efficient asynchronous operations with Trino.
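Trino's DB-API client is blocking, so a common asyncio pattern is to push each query into a worker thread with `asyncio.to_thread`. The stdlib-only sketch below illustrates that pattern; `blocking_query` is a placeholder for a real `cursor.execute(...)` call, and this is not necessarily the server's literal implementation.

```python
import asyncio

def blocking_query(sql: str):
    # Placeholder for a blocking cursor.execute(...) / fetchall() call.
    return f"results for: {sql}"

async def run_query(sql: str):
    # Off-load the blocking driver call so the event loop stays responsive.
    return await asyncio.to_thread(blocking_query, sql)

async def main():
    # Several queries can now run concurrently from a single event loop.
    return await asyncio.gather(
        run_query("SELECT 1"),
        run_query("SHOW TABLES"),
    )

if __name__ == "__main__":
    print(asyncio.run(main()))
```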
What dependencies are required to run mcp-server-trino?
You need Python 3.9+, the trino Python driver, and the MCP Python library installed.
How does this MCP server expose Trino tables to AI models?
It lists Trino tables as MCP resources, allowing models to query and read table data directly.
Is this MCP server suitable for big data analytics environments?
Yes, it is designed to integrate Trino's distributed SQL engine with AI workflows for large-scale data access.
Can this server be used with multiple Trino catalogs and schemas?
Yes, connection details including catalog and schema are configurable via environment variables.