XiYan MCP Server

Created by XGenerationLab

A Model Context Protocol (MCP) server that enables natural language queries to databases,
powered by XiYan-SQL, the SOTA of text-to-SQL on open benchmarks.

💻 XiYan-mcp-server | 🌐 XiYan-SQL | 📖 Arxiv | 📄 PapersWithCode 💻 HuggingFace | 🤖 ModelScope | 🌕 析言GBI
License: Apache 2.0
English | 中文
DingTalk Group | Follow me on Weibo

Features

  • 🌐 Fetch data by natural language through XiYanSQL
  • 🤖 Support for general LLMs (GPT, qwen-max) and the SOTA Text-to-SQL model
  • 💻 Support for a pure local mode (high security!)
  • 📝 Support for MySQL and PostgreSQL
  • 🖱️ List available tables as resources
  • 🔧 Read table contents

Preview

Architecture

There are two ways to integrate this server into your project, as shown below. The left is remote mode, which is the default. It requires an API key to access the XiYanSQL-QwenCoder-32B model from a service provider (see Configuration). The other is local mode, which is more secure and does not require an API key.

architecture.png

Best practice

Build a local data assistant using MCP + ModelScope API-Inference without writing a single line of code

Tools Preview

  • The tool get_data provides a natural language interface for retrieving data from a database. The server converts the input natural language into SQL using the built-in model and queries the database to return the results.

  • The {dialect}://{table_name} resource returns a sample of data from the specified table, for the model to use as reference.

  • The {dialect}:// resource lists the names of the current databases.
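Conceptually, get_data is a two-step pipeline: the model translates the question into SQL, then the server executes that SQL and returns the rows. A minimal sketch of that flow, with an in-memory SQLite database and a stubbed model standing in for XiYanSQL-QwenCoder (all names here are illustrative, not the server's actual internals):

```python
import sqlite3

def stub_text_to_sql(question: str) -> str:
    """Stand-in for the real model call: the actual server sends the
    question (plus schema context) to the model configured in the yml."""
    if "how many users" in question.lower():
        return "SELECT COUNT(*) FROM users"
    raise ValueError("question not understood by this tiny stub")

def get_data(conn, question: str):
    """Sketch of the get_data tool: natural language -> SQL -> rows."""
    sql = stub_text_to_sql(question)
    return conn.execute(sql).fetchall()

# Demo against in-memory SQLite (the server itself targets MySQL and
# PostgreSQL; SQLite is used here only to keep the sketch runnable).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "a"), (2, "b")])
rows = get_data(conn, "How many users are there?")
print(rows)  # [(2,)]
```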

Installation

Installing from pip

Python 3.11+ is required. You can install the server through pip, and it will install the latest version:

pip install xiyan-mcp-server

After that you can directly run the server by:

python -m xiyan_mcp_server

But it does not provide any functions until you complete the following configuration, which gives you a yml file. After that, you can run the server by:

env YML=path/to/yml python -m xiyan_mcp_server

Installing from Smithery.ai

See @XGenerationLab/xiyan_mcp_server

Not fully tested.

Configuration

You need a yml config file to configure the server. A default config file is provided in config_demo.yml, which looks like this:

model:
  name: "XGenerationLab/XiYanSQL-QwenCoder-32B-2412"
  key: ""
  url: "https://api-inference.modelscope.cn/v1/"

database:
  host: "localhost"
  port: 3306
  user: "root"
  password: ""
  database: ""

LLM Configuration

name is the name of the model to use, key is the API key for the model, and url is the API url of the model. We support the following models.

| versions | general LLMs (GPT, qwen-max) | SOTA model by ModelScope | SOTA model by DashScope | Local LLMs |
|---|---|---|---|---|
| description | basic, easy to use | best performance, stable, recommended | best performance, for trial | slow, high security |
| name | the official model name (e.g. gpt-3.5-turbo, qwen-max) | XGenerationLab/XiYanSQL-QwenCoder-32B-2412 | xiyansql-qwencoder-32b | xiyansql-qwencoder-3b |
| key | the API key of the service provider (e.g. OpenAI, Alibaba Cloud) | the API key of ModelScope | the API key via email | "" |
| url | the endpoint of the service provider (e.g. "https://api.openai.com/v1") | https://api-inference.modelscope.cn/v1/ | https://xiyan-stream.biz.aliyun.com/service/api/xiyan-sql | http://localhost:5090 |
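Most of these endpoints follow the OpenAI-compatible chat-completions convention, so the three config fields map directly onto a request. A sketch of that mapping, payload shape only, with no network call (assumes an OpenAI-style endpoint; the real server builds its own prompt with schema context before calling the model):

```python
def build_chat_request(name: str, key: str, url: str, question: str):
    """Compose an OpenAI-style chat-completions request from the
    model's name/key/url config fields. Illustrative only."""
    endpoint = url.rstrip("/") + "/chat/completions"
    headers = {"Authorization": f"Bearer {key}",
               "Content-Type": "application/json"}
    body = {"model": name,
            "messages": [{"role": "user", "content": question}]}
    return endpoint, headers, body

endpoint, headers, body = build_chat_request(
    "gpt-3.5-turbo", "sk-...", "https://api.openai.com/v1",
    "How many orders were placed last week?")
print(endpoint)  # https://api.openai.com/v1/chat/completions
```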

General LLMs

If you want to use a general LLM, e.g. gpt-3.5-turbo, you can configure it like this:

model:
  name: "gpt-3.5-turbo"
  key: "YOUR KEY"
  url: "https://api.openai.com/v1"
database:

If you want to use Qwen from Alibaba, e.g. qwen-max, you can use the following config:

model:
  name: "qwen-max"
  key: "YOUR KEY"
  url: "https://dashscope.aliyuncs.com/compatible-mode/v1"
database:

Text-to-SQL SOTA model

We recommend XiYanSQL-QwenCoder-32B (https://github.com/XGenerationLab/XiYanSQL-QwenCoder), which is the SOTA model in text-to-SQL; see the Bird benchmark. There are two ways to use the model; you can use either of them: (1) ModelScope, (2) Alibaba Cloud DashScope.

(1) ModelScope version

You need to apply for an API-inference key from ModelScope (https://www.modelscope.cn/docs/model-service/API-Inference/intro). Then you can use the following config:

model:
  name: "XGenerationLab/XiYanSQL-QwenCoder-32B-2412"
  key: ""
  url: "https://api-inference.modelscope.cn/v1/"

Read our model description for more details.

(2) DashScope version

We deployed the model on Alibaba Cloud DashScope. To get a key, send an email to godot.lzl@alibaba-inc.com with the following information:

name: "YOUR NAME",
email: "YOUR EMAIL",
organization: "your college or Company or Organization"

We will send a key to your email, which you can then fill into the yml file. The key expires after 1 month or 200 queries, or per other legal restrictions.

model:
  name: "xiyansql-qwencoder-32b"
  key: "KEY"
  url: "https://xiyan-stream.biz.aliyun.com/service/api/xiyan-sql"
database:

Note: this model service is for trial only. If you need to use it in production, please contact us.

Alternatively, you can also deploy the XiYanSQL-QwenCoder-32B model on your own server.

Local Model

Note: the local model is slow (about 12 seconds per query on my MacBook). If you need a stable and fast service, we still recommend the ModelScope version.

To run xiyan_mcp_server in local mode, you need:

  1. a PC/Mac with at least 16GB of RAM
  2. 6GB of disk space

step1: Install additional python packages

pip install flask modelscope torch==2.2.2 "accelerate>=0.26.0" numpy==2.2.3

step2: (optional) manually download the model. We recommend xiyansql-qwencoder-3b. You can manually download the model by:

modelscope download --model XGenerationLab/XiYanSQL-QwenCoder-3B-2502

It will take you 6GB disk space.

step3: download the script and run the server: src/xiyan_mcp_server/local_xiyan_server.py

python local_xiyan_server.py

The server will be running on http://localhost:5090/

step4: prepare the config and run xiyan_mcp_server. The config.yml should look like:

model:
  name: "xiyansql-qwencoder-3b"
  key: "KEY"
  url: "http://127.0.0.1:5090"

Now the local mode is ready.
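The contract between xiyan_mcp_server and the local model server is just HTTP: the yml url points at an endpoint that accepts a question and returns SQL. The sketch below stands in for local_xiyan_server.py using only the stdlib, on an ephemeral port instead of 5090; the JSON request/response shape here is an assumption for illustration, not the actual wire format:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class StubModelHandler(BaseHTTPRequestHandler):
    """Stand-in for the local model server: accepts a JSON question and
    returns canned SQL (the real server runs XiYanSQL-QwenCoder-3B)."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        reply = json.dumps({"sql": f"-- SQL for: {payload['question']}"})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply.encode())
    def log_message(self, *args):  # keep the demo quiet
        pass

# Bind to an ephemeral port; the real setup listens on port 5090.
server = ThreadingHTTPServer(("127.0.0.1", 0), StubModelHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}",
    data=json.dumps({"question": "count users"}).encode(),
    headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    answer = json.loads(resp.read())
server.shutdown()
print(answer["sql"])  # -- SQL for: count users
```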

Database Configuration

host, port, user, password and database are the connection details for the database.

You can use a local or any remote database. We currently support MySQL and PostgreSQL (more dialects soon).

MySQL

database:
  host: "localhost"
  port: 3306
  user: "root"
  password: ""
  database: ""

PostgreSQL

step1: Install python packages

pip install psycopg2

step2: prepare the config.yml like this:

database:
  dialect: "postgresql"
  host: "localhost"
  port: 5432
  user: ""
  password: ""
  database: ""

Note that dialect should be "postgresql" for PostgreSQL.
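The database section maps almost directly onto the connection arguments of the corresponding Python driver. A hedged sketch of that mapping, without opening a real connection (the function name, the pymysql choice for MySQL, and the default-dialect rule are illustrative assumptions, not the server's actual connection code):

```python
def connect_kwargs(db_cfg: dict) -> tuple[str, dict]:
    """Translate the yml 'database' section into (driver, kwargs) for a
    DB-API connect() call. Illustrative only: the server may use a
    different MySQL driver; psycopg2 matches the PostgreSQL step above."""
    dialect = db_cfg.get("dialect", "mysql")  # assume mysql when omitted
    driver = {"mysql": "pymysql", "postgresql": "psycopg2"}[dialect]
    kwargs = {
        "host": db_cfg["host"],
        "port": db_cfg["port"],
        "user": db_cfg["user"],
        "password": db_cfg["password"],
        "database": db_cfg["database"],
    }
    return driver, kwargs

driver, kwargs = connect_kwargs({
    "dialect": "postgresql", "host": "localhost", "port": 5432,
    "user": "pg", "password": "", "database": "demo"})
print(driver, kwargs["port"])  # psycopg2 5432
```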

Launch

Claude desktop

Add this to your Claude Desktop config file (ref: claude desktop config example):

{
    "mcpServers": {
        "xiyan-mcp-server": {
            "command": "python",
            "args": [
                "-m",
                "xiyan_mcp_server"
            ],
            "env": {
                "YML": "PATH/TO/YML"
            }
        }
    }
}

Cline

Prepare the config as for Claude Desktop.

Goose

Add the following command in the config (ref: goose config example):

env YML=path/to/yml python -m xiyan_mcp_server

Cursor

Use the same command as Goose.

Witsy

Add the following command:

python -m xiyan_mcp_server

Add an env variable: the key is YML and the value is the path to your yml file (ref: witsy config example).

It does not work!

Contact us: DingTalk Group | Follow me on Weibo

Citation

If you find our work helpful, feel free to cite us:

@article{xiyansql,
      title={A Preview of XiYan-SQL: A Multi-Generator Ensemble Framework for Text-to-SQL}, 
      author={Yingqi Gao and Yifu Liu and Xiaoxia Li and Xiaorong Shi and Yin Zhu and Yiming Wang and Shiqi Li and Wei Li and Yuntao Hong and Zhiling Luo and Jinyang Gao and Liyu Mou and Yu Li},
      year={2024},
      journal={arXiv preprint arXiv:2411.08599},
      url={https://arxiv.org/abs/2411.08599},
      primaryClass={cs.AI}
}
