mcp-server-mariadb-vector

Created by DavidRamosSal

MCP server for MariaDB

The MariaDB Vector MCP server provides tools that LLM agents can use to interact with a MariaDB database with vector support, giving users a natural-language interface for storing and querying their data. Thanks to the Model Context Protocol (MCP), the server is compatible with any MCP client, including those built into applications like Claude Desktop and Cursor/Windsurf, as well as LLM agent frameworks like LangGraph and PydanticAI.

Using the MariaDB Vector MCP server, users can, for example:

  • Provide context from a knowledge-base to their conversations with LLM agents
  • Store and query their conversations with LLM agents

Features

  • Vector Store Management

    • Create and delete vector stores in a MariaDB database
    • List all vector stores in a MariaDB database
  • Document Management

    • Add documents with optional metadata to a vector store
    • Query a vector store using semantic search
  • Embedding Provider

    • Use OpenAI's embedding models to embed documents

MCP Tools

  • mariadb_create_vector_store: Create a vector store in a MariaDB database
  • mariadb_delete_vector_store: Delete a vector store in a MariaDB database
  • mariadb_list_vector_stores: List all vector stores in a MariaDB database
  • mariadb_insert_documents: Add documents with optional metadata to a vector store
  • mariadb_search_vector_store: Query a vector store using semantic search
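Under the hood, an MCP client invokes these tools with JSON-RPC 2.0 `tools/call` requests. As a rough sketch (the argument names below are illustrative, not taken from this server's actual tool schemas), a call to `mariadb_insert_documents` might be framed like this:

```python
import json

def make_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 tools/call request, as MCP clients do."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical arguments -- the server's real parameter names may differ.
request = make_tool_call(
    "mariadb_insert_documents",
    {
        "vector_store_name": "knowledge_base",
        "documents": ["MariaDB 11.7 introduces a VECTOR column type."],
        "metadata": [{"source": "release-notes"}],
    },
)
print(request)
```

In practice an MCP client library builds these frames for you; the sketch only shows what travels over the transport.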

Setup

Note: From here on, it is assumed that you have a running MariaDB instance with vector support (version 11.7 or higher). If you don't have one, you can quickly spin up a MariaDB instance using Docker:

docker run -p 3306:3306 --name mariadb-instance -e MARIADB_ROOT_PASSWORD=password -e MARIADB_DATABASE=database_name mariadb:11.7
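Since the vector features require MariaDB 11.7 or newer, it can be worth checking the server version before pointing the MCP server at it. A minimal sketch (this helper is ours, not part of the server) that inspects a version string like the one returned by `SELECT VERSION();`:

```python
def has_vector_support(version: str) -> bool:
    """Return True if a MariaDB version string is 11.7 or newer."""
    # VERSION() returns strings like "11.7.2-MariaDB"; keep the numeric part.
    numeric = version.split("-")[0]
    major, minor = (int(part) for part in numeric.split(".")[:2])
    return (major, minor) >= (11, 7)

print(has_vector_support("11.7.2-MariaDB"))   # True
print(has_vector_support("10.11.6-MariaDB"))  # False
```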

First clone the repository:

git clone https://github.com/DavidRamosSal/mcp-server-mariadb-vector.git

There are two ways to run the MariaDB Vector MCP server: as a Python package using uv or as a Docker container built from the provided Dockerfile.

Requirements for running the server using uv

  • uv installed

Requirements for running the server as a Docker container

  • Docker installed

Configuration

The server needs to be configured with the following environment variables:

Name                Description                               Default Value
MARIADB_HOST        Host of the running MariaDB database      127.0.0.1
MARIADB_PORT        Port of the running MariaDB database      3306
MARIADB_USER        User of the running MariaDB database      None
MARIADB_PASSWORD    Password of the running MariaDB database  None
MARIADB_DATABASE    Name of the running MariaDB database      None
EMBEDDING_PROVIDER  Provider of the embedding models          openai
EMBEDDING_MODEL     Model of the embedding provider           text-embedding-3-small
OPENAI_API_KEY      API key for OpenAI's platform             None
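As an illustration of how these variables resolve, here is a small sketch (the `load_config` helper is ours, not part of the server) applying the same defaults the table lists:

```python
import os

# Defaults taken from the configuration table; the remaining
# variables have no default and stay None when unset.
DEFAULTS = {
    "MARIADB_HOST": "127.0.0.1",
    "MARIADB_PORT": "3306",
    "EMBEDDING_PROVIDER": "openai",
    "EMBEDDING_MODEL": "text-embedding-3-small",
}
REQUIRED = ("MARIADB_USER", "MARIADB_PASSWORD", "MARIADB_DATABASE", "OPENAI_API_KEY")

def load_config(env=None) -> dict:
    """Resolve the server configuration from environment variables."""
    env = os.environ if env is None else env
    config = {name: env.get(name, default) for name, default in DEFAULTS.items()}
    config.update({name: env.get(name) for name in REQUIRED})
    return config
```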

Running the server using uv

Using uv, you can add a .env file to the root of the cloned repository with the environment variables and run the server with the following command:

uv run --directory path/to/mcp-server-mariadb-vector/ --env-file path/to/mcp-server-mariadb-vector/.env mcp_server_mariadb_vector

The dependencies will be installed automatically. An optional --transport argument can be added to specify the transport protocol to use. The default value is stdio.
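A minimal .env file matching the configuration table might look like this (all values are placeholders):

```
MARIADB_HOST=127.0.0.1
MARIADB_PORT=3306
MARIADB_USER=user
MARIADB_PASSWORD=password
MARIADB_DATABASE=database_name
EMBEDDING_PROVIDER=openai
EMBEDDING_MODEL=text-embedding-3-small
OPENAI_API_KEY=your-openai-api-key
```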

Running the server as a Docker container

Build the Docker container from the root directory of the cloned repository by running the following command:

docker build -t mcp-server-mariadb-vector .

Then run the container (replace with your own configuration):

docker run -p 8000:8000 \
  --add-host host.docker.internal:host-gateway \
  -e MARIADB_HOST="host.docker.internal" \
  -e MARIADB_PORT="port" \
  -e MARIADB_USER="user" \
  -e MARIADB_PASSWORD="password" \
  -e MARIADB_DATABASE="database" \
  -e EMBEDDING_PROVIDER="openai" \
  -e EMBEDDING_MODEL="embedding-model" \
  -e OPENAI_API_KEY="your-openai-api-key" \
  mcp-server-mariadb-vector

The server will be available at http://localhost:8000/sse, using the SSE transport protocol. Make sure to leave MARIADB_HOST set to host.docker.internal if you are running the MariaDB database as a Docker container on your host machine.
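The /sse endpoint speaks standard Server-Sent Events, where each event is a block of `field: value` lines terminated by a blank line. As a rough illustration of that wire format (this parser is ours and not part of the server):

```python
def parse_sse(stream: str) -> list[dict]:
    """Split a Server-Sent Events stream into events.

    Each event becomes a dict with optional 'event' and 'data' keys;
    events are separated by blank lines, per the SSE format.
    """
    events = []
    for block in stream.split("\n\n"):
        event = {}
        data_lines = []
        for line in block.splitlines():
            if line.startswith("event:"):
                event["event"] = line[len("event:"):].strip()
            elif line.startswith("data:"):
                data_lines.append(line[len("data:"):].strip())
        if data_lines:
            event["data"] = "\n".join(data_lines)
        if event:
            events.append(event)
    return events

sample = "event: endpoint\ndata: /messages?session_id=abc123\n\n"
print(parse_sse(sample))
```

MCP clients with SSE support handle this framing internally; the sketch is only meant to demystify what the endpoint serves.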

Integration with Claude Desktop | Cursor | Windsurf

Claude Desktop, Cursor and Windsurf can run and connect to the server automatically using stdio transport. To do so, add the following to your configuration file (claude_desktop_config.json for Claude Desktop, mcp.json for Cursor or mcp_config.json for Windsurf):

{
  "mcpServers": {
    "mariadb-vector": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "path/to/mcp-server-mariadb-vector/",
        "--env-file",
        "path/to/mcp-server-mariadb-vector/.env",
        "mcp-server-mariadb-vector"
      ]
    }
  }
}

Alternatively, Cursor and Windsurf can connect to an already running server on your host machine (e.g. if you are running the server as a Docker container) using SSE transport. To do so, add the following to the corresponding configuration file:

  "mcpServers": {
    "mariadb-vector": {
      "url": "http://localhost:8000/sse"
    }
  }
}