
Mermaid MCP Server

A Model Context Protocol (MCP) server for validating Mermaid diagrams.

Implements a minimal Python wrapper over https://github.com/mermaid-js/mermaid-cli for simpler out-of-the-box use.

Overview

Python MCP server for validating Mermaid diagrams and (optionally) rendering them as PNG images. It uses the Mermaid CLI tool to perform the validation and rendering.

The server provides LLMs with structured validation results including:

  • Boolean validation status (is_valid: true/false) indicating whether the Mermaid diagram syntax is correct
  • Detailed error messages explaining exactly what went wrong if validation fails (e.g., syntax errors, unsupported diagram types, malformed nodes)
  • Optional base64-encoded PNG images of successfully rendered diagrams for visual verification

This enables LLMs to programmatically validate Mermaid diagram syntax, understand specific errors to provide helpful corrections, and optionally receive visual confirmation of the rendered output.
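
The structured result maps directly onto a Pydantic model. A minimal sketch, with field names taken from the example output in the Usage section below (the repository's actual definition may differ):

from pydantic import BaseModel

class MermaidValidationResult(BaseModel):
    """Structured result returned to the MCP client."""

    is_valid: bool                    # True when mermaid-cli accepts the diagram
    error_message: str | None = None  # CLI error output when validation fails
    diagram_image: str | None = None  # base64-encoded PNG, only when requested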

The repository also provides a simple Pydantic AI MCP client that invokes the server with a Gemini model for testing.

Prerequisites

Important: This MCP server requires Node.js to be installed on your system, even if you're only using the server component (not the client). The server internally calls npx @mermaid-js/mermaid-cli as a subprocess to perform diagram validation and rendering.

Required Dependencies

  • Node.js with npm (required for all usage)
  • Mermaid CLI: Install with npm install -g @mermaid-js/mermaid-cli
  • Python with uv (for running the MCP server)

Quick Dependency Setup

# Install Mermaid CLI globally
npm install -g @mermaid-js/mermaid-cli

# Verify installation
npx @mermaid-js/mermaid-cli --version
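
The same check can be run from Python, for example before launching the server. A minimal sketch (not part of the repository):

import shutil
import subprocess

# Confirm Node.js tooling is on PATH, then ask mermaid-cli for its version.
if shutil.which("npx") is None:
    raise RuntimeError("Node.js (npx) not found; install Node.js first")

subprocess.run(["npx", "@mermaid-js/mermaid-cli", "--version"], check=True)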

Quickstart

To use this server with an MCP client (like Claude Desktop), add the following configuration to your MCP settings:

Note: Make sure you have Node.js and Mermaid CLI installed (see Prerequisites above) before configuring the MCP server.

Configuration Format

  1. Clone this repository

  2. Add this to your MCP client configuration file (e.g., claude_desktop_config.json):

{
  "mcpServers": {
    "mermaid-validator": {
      "command": "uv",
      "args": ["run", "/path/to/mermaid_mcp_server.py"],
    }
  }
}

Configuration Options

  • command: Use uv to run the server
  • args: Run the server script with uv run
  • cwd: Set to the absolute path of your cloned repository
  • env: Environment variables for the server
    • MCP_TRANSPORT: Set to "stdio" for standard input/output communication

Example Extended Configuration

{
  "mcpServers": {
    "mermaid-validator": {
      "command": "uv", 
      "args": ["run", "/path/to/mermaid_mcp_server.py"],
      "env": {
        "MCP_TRANSPORT": "stdio",
      }
    }
  }
}

Abstractions Over Mermaid-CLI

The Python wrapper significantly simplifies the usage of the Mermaid CLI by abstracting away complex file handling and command-line arguments:

Without the wrapper (raw mermaid-cli):

# Create input file
echo "graph TD; A-->B" > diagram.mmd

# Create puppeteer config file
echo '{"args": ["--no-sandbox", "--disable-setuid-sandbox"]}' > puppeteer-config.json

# Run mermaid-cli with multiple arguments
npx @mermaid-js/mermaid-cli -i diagram.mmd -o output.png --puppeteerConfigFile puppeteer-config.json

# Handle output file and cleanup

With the Python wrapper:

# Simple function call with diagram text
result = await validate_mermaid_diagram("graph TD; A-->B")

# All file handling, configuration, and cleanup is automatic
# Returns structured result with validation status and base64-encoded image

Key Abstractions:

  1. Temporary File Management: Automatically creates and cleans up temporary .mmd input files
  2. Output File Handling: Manages temporary .png output files and converts them to base64 strings
  3. Puppeteer Configuration: Automatically generates the required sandboxing configuration for headless browser rendering
  4. Error Handling: Captures and returns structured error messages instead of raw stderr output
  5. Command Construction: Builds the complete npx @mermaid-js/mermaid-cli command with all necessary flags
  6. Resource Cleanup: Ensures all temporary files are properly deleted after processing

This abstraction allows users to focus on diagram validation and rendering without dealing with the underlying file system operations and command-line complexities.
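
For illustration, a rough sketch of how these abstractions can fit together around a single subprocess call, using the MermaidValidationResult model sketched earlier; names and details are illustrative, not the repository's actual code:

import asyncio
import base64
import json
import tempfile
from pathlib import Path

async def validate_mermaid_diagram(diagram_text: str, return_image: bool = False):
    """Illustrative sketch: write temp files, invoke mermaid-cli, clean up."""
    with tempfile.TemporaryDirectory() as tmp:        # (1) and (6): temp files and cleanup
        mmd = Path(tmp) / "diagram.mmd"
        png = Path(tmp) / "diagram.png"
        cfg = Path(tmp) / "puppeteer-config.json"
        mmd.write_text(diagram_text)
        cfg.write_text(json.dumps(                    # (3): puppeteer sandbox config
            {"args": ["--no-sandbox", "--disable-setuid-sandbox"]}))

        proc = await asyncio.create_subprocess_exec(  # (5): command construction
            "npx", "@mermaid-js/mermaid-cli",
            "-i", str(mmd), "-o", str(png),
            "--puppeteerConfigFile", str(cfg),
            stdout=asyncio.subprocess.PIPE,
            stderr=asyncio.subprocess.PIPE,
        )
        _, stderr = await proc.communicate()

        if proc.returncode != 0:                      # (4): structured error message
            return MermaidValidationResult(
                is_valid=False, error_message=stderr.decode().strip())

        image = None
        if return_image:                              # (2): output PNG -> base64 string
            image = base64.b64encode(png.read_bytes()).decode()
        return MermaidValidationResult(is_valid=True, diagram_image=image)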

Local Development

This repository can be used standalone to test the functionality of the Mermaid MCP validator programmatically.

Requirements

See the Prerequisites section above for required dependencies (Node.js, Mermaid CLI, and Python with uv).

Use the provided Makefile for streamlined setup:

# Install all dependencies (Python + Node.js + Mermaid CLI)
make install

# Run validation tests
make test

Manual Setup

If you prefer manual setup:

  1. Clone this repository
  2. Install dependencies: uv sync
  3. Install Mermaid CLI: npm install -g @mermaid-js/mermaid-cli
  4. Copy .env.example to .env and fill in your API key
  5. Run the server: uv run mermaid_mcp_server.py

Usage

The server exposes a tool for validating Mermaid diagrams:

  • validate_mermaid_diagram: Validates a Mermaid diagram and returns validation results

Tool Parameters

  • diagram_text (required): The Mermaid diagram text to validate
  • return_image (optional, default: false): Whether to return the base64-encoded PNG image
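
These parameters map directly onto the tool's signature. A sketch of how the tool might be registered, assuming the server is built with the official Python MCP SDK's FastMCP helper (the framework choice is an assumption, not taken from the repository):

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("mermaid-validator")

@mcp.tool()
async def validate_mermaid_diagram(
    diagram_text: str,                 # required: the Mermaid diagram text
    return_image: bool = False,        # optional: include base64 PNG in the result
) -> MermaidValidationResult:
    """Validate a Mermaid diagram and optionally return a rendered PNG."""
    ...  # see the wrapper sketch in the Abstractions section above

if __name__ == "__main__":
    mcp.run(transport="stdio")         # matches MCP_TRANSPORT=stdio in the client config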

Context Length Optimisation

Important: By default, the tool does not return the base64-encoded image (return_image=false) to preserve context length in LLM conversations. Base64-encoded images can be very long strings (often 10KB-100KB+) that significantly impact the available context for the conversation.

When to use each setting:

  • return_image=false (default): Use for diagram validation only. Fast and context-efficient.
  • return_image=true: Use only when you specifically need the rendered image data. Warning: This will consume significant context length.

Example Usage

# Validation only (recommended for most cases)
result = await validate_mermaid_diagram("graph TD; A-->B")
# Returns: MermaidValidationResult(is_valid=True, error_message=None, diagram_image=None)

# Validation with image (use sparingly)
result = await validate_mermaid_diagram("graph TD; A-->B", return_image=True)
# Returns: MermaidValidationResult(is_valid=True, error_message=None, diagram_image="iVBORw0KGgoAAAANSUhEUg...")

Testing

The project includes convenient testing commands:

# Run all tests
make test

# Or run the test script directly
uv run test_pydantic.py

The test script uses Pydantic AI with Gemini models to validate the MCP server functionality.
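
For reference, a minimal sketch of such a client using pydantic-ai's stdio MCP support and a Gemini model. Class and method names follow pydantic-ai's documented API but may differ between versions, and the model name is only an example:

import asyncio

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStdio

# Launch the MCP server over stdio, the same way the client config above does.
server = MCPServerStdio("uv", args=["run", "mermaid_mcp_server.py"])
agent = Agent("google-gla:gemini-2.0-flash", mcp_servers=[server])

async def main():
    async with agent.run_mcp_servers():
        result = await agent.run("Validate this Mermaid diagram: graph TD; A-->B")
        print(result.output)  # result.data in older pydantic-ai releases

asyncio.run(main())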
