
Trino MCP Server

by stinkgen

The Trino MCP Server provides AI models with structured access to Trino's distributed SQL query engine. It exposes Trino resources through the Model Context Protocol (MCP), enabling AI tools to query and analyze data in Trino.



What is Trino MCP Server?

The Trino MCP Server lets AI models interact with and query data through Trino's distributed SQL query engine using the Model Context Protocol (MCP). It gives AI a structured way to access and analyze any data Trino can reach.

How to use Trino MCP Server?

Start the server with Docker Compose or run the standalone Python API. Once it is running, you can interact with it through a command-line interface or a REST API. The server exposes endpoints for executing SQL queries, so LLMs can generate and run queries against the Trino instance.
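
For example, here is a minimal sketch of calling the standalone Python API from a script. It assumes the API is running on its default port 8008 and exposes a query endpoint that accepts a JSON body containing the SQL text; the endpoint path (/api/query) and payload shape shown here are illustrative assumptions, not confirmed API details.

```python
import requests

# Assumed endpoint of the standalone Python API (default port 8008).
# The path and request/response shapes below are illustrative assumptions;
# check the project's README for the actual API contract.
API_URL = "http://localhost:8008/api/query"

# Send a simple SQL statement for Trino to execute.
response = requests.post(API_URL, json={"query": "SELECT 1 AS ok"})
response.raise_for_status()

# Print whatever result payload the API returns.
print(response.json())
```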

Key features of Trino MCP Server

  • Exposes Trino resources through the MCP protocol

  • Enables AI tools to query and analyze data in Trino

  • Provides both Docker container API and standalone Python API server options

  • Includes demo and validation scripts for testing and showcasing functionality

Use cases of Trino MCP Server

  • Allowing LLMs to query Trino databases directly

  • Generating complex SQL queries based on natural language prompts

  • Performing data analysis and presenting results in a user-friendly format

  • Creating visualizations of data using Mermaid charts

FAQ about Trino MCP Server

What is MCP?

MCP stands for Model Context Protocol. It's a protocol that allows AI models to interact with data sources in a structured way.

What transport options are supported?

The server supports STDIO and SSE transports. STDIO is the recommended option and works reliably; SSE has known issues and is not recommended.

How do I start the server?

You can start the server using Docker Compose (`docker-compose up -d`) or by running the standalone Python API (`python llm_trino_api.py`).

What ports does the server use?

The Docker container API listens on port 9097, and the standalone Python API on port 8008 by default. Within the Docker Compose setup, Trino itself is exposed on port 9095 and the MCP server on port 9096.

How can LLMs use this server?

LLMs can use the server to get database schema information, run complex analytical queries, and perform data analysis by sending SQL queries to the API endpoints.
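
As an illustration, here is a hedged sketch of how an LLM-driven client might first discover schema information before running analytical queries. It reuses the illustrative /api/query endpoint assumed above; catalog names will vary by deployment.

```python
import requests

# Assumed endpoint, as in the earlier example; not a confirmed API path.
API_URL = "http://localhost:8008/api/query"

def run_sql(sql: str) -> dict:
    """Post a SQL statement to the API and return the parsed JSON response."""
    resp = requests.post(API_URL, json={"query": sql})
    resp.raise_for_status()
    return resp.json()

# Discover what data is available before writing analytical queries.
print(run_sql("SHOW CATALOGS"))

# Replace 'memory' with a catalog from the previous result.
print(run_sql("SHOW SCHEMAS FROM memory"))
```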