
XiYan MCP Server

by XGenerationLab

XiYan MCP Server is a Model Context Protocol (MCP) server that enables natural language queries to databases. It is powered by XiYan-SQL, a state-of-the-art text-to-SQL model.


What is XiYan MCP Server?

XiYan MCP Server is a server that allows users to query databases using natural language. It leverages the XiYan-SQL model to translate natural language queries into SQL, which is then executed against the database.

How to use XiYan MCP Server?

  1. Install the server using pip or Smithery.ai.
  2. Configure the LLM and database connection details in a YAML config file.
  3. Launch the server and integrate it with a client such as Claude Desktop, Cline, Goose, or Cursor.
  4. Query the database in natural language through the integrated client.
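Step 2 refers to a YAML config file holding the LLM and database settings. The sketch below shows what such a file might look like; the key names and values are illustrative assumptions, so consult the project README for the authoritative schema.

```yaml
# Illustrative sketch only -- key names are assumptions, not the confirmed schema.
model:
  name: "XiYanSQL-QwenCoder-32B"                 # recommended model (see FAQ)
  key: "YOUR_API_KEY"                            # e.g., a ModelScope API-inference key
  url: "https://api-inference.modelscope.cn/v1/" # or a local model endpoint
database:
  host: "localhost"
  port: 3306                                     # MySQL default; PostgreSQL is typically 5432
  user: "root"
  password: "YOUR_PASSWORD"
  database: "your_database"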

Key features of XiYan MCP Server

  • Fetch data by natural language through XiYanSQL

  • Support general LLMs (e.g., GPT, Qwen-Max) as well as the state-of-the-art XiYanSQL text-to-SQL model

  • Support pure local mode (high security!)

  • Support MySQL and PostgreSQL

  • List available tables as resources

  • Read table contents
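To expose these features to an MCP client, the server is registered in the client's configuration. The JSON below is a hedged sketch of what such an entry might look like for Claude Desktop, following common MCP conventions; the module name and the `YML` environment variable are assumptions, not confirmed details of this project.

```json
{
  "mcpServers": {
    "xiyan-mcp-server": {
      "command": "python",
      "args": ["-m", "xiyan_mcp_server"],
      "env": {
        "YML": "/absolute/path/to/config.yml"
      }
    }
  }
}
```

After restarting the client, the server's tables should appear as resources and natural-language queries can be routed through it.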

Use cases of XiYan MCP Server

  • Building a local data assistant

  • Providing a natural language interface for database querying

  • Integrating database access into various applications

  • Simplifying data retrieval for non-technical users

FAQ from XiYan MCP Server

What is the recommended model to use?

XiYanSQL-QwenCoder-32B is recommended for best performance and stability.

How do I get an API key for the Modelscope version?

Apply for an API-inference key from ModelScope: https://www.modelscope.cn/docs/model-service/API-Inference/intro

How do I configure the server for local mode?

Install the required Python packages, download the local model, and configure the config.yml file with the local model's URL.
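In practice the local-mode change usually amounts to pointing the model `url` at the locally served endpoint instead of a remote API. The fragment below is a sketch under that assumption; the key names, placeholder values, and port are illustrative, not confirmed defaults.

```yaml
# Illustrative local-mode fragment -- keys and port are assumptions.
model:
  name: ""                      # model name may be unused in local mode
  key: "KEY"                    # placeholder; no remote API key is needed locally
  url: "http://localhost:5090"  # URL where the downloaded local model is served
```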

What databases are supported?

Currently, MySQL and PostgreSQL are supported.

Where can I get help if the server doesn't work?

Contact the developers through the Ding Group or Weibo (links provided in the README).