
Graphiti MCP Pro

English | 中文

About Graphiti

Graphiti is a framework for building and querying temporally-aware knowledge graphs, specifically tailored for AI agents operating in dynamic environments. Unlike traditional retrieval-augmented generation (RAG) methods, Graphiti continuously integrates user interactions, structured and unstructured enterprise data, and external information into a coherent, queryable graph. The framework supports incremental data updates, efficient retrieval, and precise historical queries without requiring complete graph recomputation, making it suitable for developing interactive, context-aware AI applications.

This project is an enhanced memory-store MCP service and management platform based on Graphiti. Compared with the original project's MCP service, it offers three core advantages: enhanced core capabilities, broader AI model compatibility, and a comprehensive visual management interface.

Features

Enhanced Core Capabilities

Asynchronous Parallel Processing

Adding memories is the core function of the MCP service. We have introduced an asynchronous parallel processing mechanism on top of the original implementation: within a single group ID (for example, one per development project), up to 5 add_memory tasks can execute in parallel, significantly improving processing efficiency.
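
The mechanism can be pictured as a per-group concurrency cap. Below is a minimal sketch of the idea in Python, not the project's actual code; process_episode and the task payloads are hypothetical stand-ins:

    import asyncio
    from collections import defaultdict

    MAX_PARALLEL_PER_GROUP = 5  # the limit described above
    _semaphores: defaultdict[str, asyncio.Semaphore] = defaultdict(
        lambda: asyncio.Semaphore(MAX_PARALLEL_PER_GROUP)
    )

    async def process_episode(group_id: str, episode: str) -> None:
        await asyncio.sleep(0.1)  # stand-in for LLM extraction and graph writes

    async def add_memory(group_id: str, episode: str) -> None:
        async with _semaphores[group_id]:  # at most 5 in flight per group
            await process_episode(group_id, episode)

    async def main() -> None:
        # Ten episodes for one project run five at a time instead of serially.
        await asyncio.gather(
            *(add_memory("project-a", f"episode {i}") for i in range(10))
        )

    asyncio.run(main())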

Task Management Tools

Four new MCP tools have been added for managing add_memory tasks (a client-side usage sketch follows the list):

  • list_add_memory_tasks - List all add_memory tasks
  • get_add_memory_task_status - Get add_memory task status
  • wait_for_add_memory_task - Wait for add_memory task completion
  • cancel_add_memory_task - Cancel add_memory task
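
For illustration, here is a hedged client-side sketch using the official MCP Python SDK over streamable-http; the argument key task_id is an assumption about the tools' input schemas, not something this README specifies:

    import asyncio
    from mcp import ClientSession
    from mcp.client.streamable_http import streamablehttp_client

    async def main() -> None:
        # Connect to the service's streamable-http endpoint (default port 8082)
        async with streamablehttp_client("http://localhost:8082/mcp") as (read, write, _):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tasks = await session.call_tool("list_add_memory_tasks", {})
                print(tasks.content)
                # Block until a specific task completes ("task_id" is hypothetical)
                done = await session.call_tool(
                    "wait_for_add_memory_task", {"task_id": "<task-id>"}
                )
                print(done.content)

    asyncio.run(main())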

Unified Configuration Management

Configuration management has been overhauled to resolve inconsistencies between command-line parameters, environment variables, and the management backend's database configuration.

NOTE

When the management backend is enabled, the MCP service parameters in the .env configuration file take effect only during the initial startup. After that, configuration is read from the management backend database.
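
The precedence rule amounts to "database wins after the first startup". A hypothetical sketch, assuming a simple settings store (the store interface and variable prefix are illustrative, not the project's actual API):

    import os

    def resolve_config(store, manager_enabled: bool) -> dict:
        # After the initial startup, the management backend's database wins.
        if manager_enabled and store.has_settings():
            return store.load_settings()
        # Initial startup (or manager disabled): fall back to .env values.
        env_config = {k: v for k, v in os.environ.items() if k.startswith("GRAPHITI_")}
        if manager_enabled:
            store.save_settings(env_config)  # seed the database exactly once
        return env_config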

Broader AI Model Compatibility and Flexibility

Enhanced Model Compatibility

Through integration with the instructor library, model compatibility has been significantly improved. The service now supports models such as DeepSeek and Qwen, and even locally hosted models served through Ollama or vLLM, as long as they expose an OpenAI-compatible API.
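
As a hedged illustration of what this enables, instructor can wrap any OpenAI-compatible client to produce validated structured output; the base_url and model name below assume a local Ollama server and are examples only:

    import instructor
    from openai import OpenAI
    from pydantic import BaseModel

    class Entity(BaseModel):
        name: str
        type: str

    # JSON mode avoids depending on native function/tool-calling support
    client = instructor.from_openai(
        OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),
        mode=instructor.Mode.JSON,
    )

    entities = client.chat.completions.create(
        model="qwen2.5:14b",  # any OpenAI-compatible chat model
        response_model=list[Entity],
        messages=[{"role": "user", "content": "Alice joined Acme Corp in 2024."}],
    )
    print(entities)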

Separated Model Configuration

The original unified LLM configuration has been split into three independent configurations, allowing flexible combinations based on actual needs:

  • Large Model (LLM): Responsible for entity and relationship extraction
  • Small Model (Small LLM): Handles entity attribute summarization, relationship deduplication, reranking, and other lightweight tasks
  • Embedding Model (Embedder): Dedicated to text vectorization

NOTE

When configuring the embedding model, note that its API path differs from the two LLMs above: the LLMs use the chat-completion path {base_url}/chat/completions, while text embedding uses {base_url}/embeddings. If you select "Same as Large Model" in the management backend, make sure the configured large model also supports text embedding.

Additionally, if you run the service via docker compose while the LLM or embedding model runs locally on the host, base_url must be set to http://host.docker.internal:{port}, where {port} is the port your local model service listens on.
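
The path difference is easy to verify with the openai SDK; the base_url, port, and model names below are placeholders for your own deployment:

    from openai import OpenAI

    # From inside a docker compose container, reach a model served on the host
    client = OpenAI(base_url="http://host.docker.internal:11434/v1", api_key="local")

    chat = client.chat.completions.create(  # POST {base_url}/chat/completions
        model="your-llm",
        messages=[{"role": "user", "content": "ping"}],
    )
    vec = client.embeddings.create(  # POST {base_url}/embeddings
        model="your-embedder",
        input="text to vectorize",
    )
    print(chat.choices[0].message.content, len(vec.data[0].embedding))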

Comprehensive Management Platform

[Screenshot: management Web UI (English)]

To provide a better user experience and observability, we have developed a complete management backend and Web UI. Through the management interface, you can:

  • Service Control: Start, stop, restart MCP service
  • Configuration Management: Real-time configuration updates and adjustments
  • Usage Monitoring: View detailed token usage statistics
  • Log Viewing: Real-time and historical log queries

Getting Started

  1. Clone Project

    git clone https://github.com/itcook/graphiti-mcp-pro
    # or git clone git@github.com:itcook/graphiti-mcp-pro.git
    cd graphiti-mcp-pro
    
  2. Configure Environment Variables (Optional)

    # Create .env from the example configuration file
    mv .env.example.en .env
    # Edit .env file according to the instructions
    

NOTE

If you want to keep using data from a previous Graphiti MCP deployment, set the NEO4J_-prefixed parameters in the .env file to your Neo4j connection information and leave the other parameters at their defaults.
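
For reference, Graphiti conventionally reads its Neo4j connection from variables like the following; confirm the exact names in .env.example.en:

    # Hypothetical .env excerpt; verify the names against .env.example.en
    NEO4J_URI=bolt://localhost:7687
    NEO4J_USER=neo4j
    NEO4J_PASSWORD=your_password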

  3. Start Services

    docker compose up -d
    

TIP

If the project has updates and you need to rebuild the image, use docker compose up -d --build.

Rest assured: data is persisted in the external database and will not be lost when the image is rebuilt.

  4. Access the Management Interface. Default address: http://localhost:6062

Manual Installation

NOTE

Prerequisites:

  1. Python 3.10+ and uv project manager
  2. Node.js 20+
  3. Accessible Neo4j 5.26+ database service
  4. AI model service

  1. Clone Project

    git clone https://github.com/itcook/graphiti-mcp-pro
    # or git clone git@github.com:itcook/graphiti-mcp-pro.git
    cd graphiti-mcp-pro
    
  2. Install Dependencies

    uv sync
    
  3. Configure Environment Variables

    # Create .env from the example configuration file
    mv .env.example.en .env
    # Edit .env file according to the instructions
    
  4. Run MCP Service

    # Run service with management backend
    uv run main.py -m
    # Or run MCP service only
    # uv run main.py
    
  5. Build and Run Management Frontend

    Enter frontend directory and install dependencies:

    cd manager/frontend
    pnpm install  # or npm install / yarn
    

    Build and run frontend:

    pnpm run build   # or npm run build / yarn build
    pnpm run preview # or npm run preview / yarn preview
    

    Access management interface: http://localhost:6062

Important Notes

Known Limitations

  • 🔒 Security Notice: The management backend does not implement authorization access mechanisms. DO NOT expose the service on public servers.
  • 🧪 Test Coverage: Due to resource constraints, the project has not been thoroughly tested. Recommended for personal use only.
  • 📡 Transport Protocol: Only the streamable-http transport is supported; the stdio and sse transports from the original project have been removed.
  • ⚙️ Code Optimization: Some architectural designs (dependency injection, exception handling, client decoupling, etc.) still have room for optimization.

Usage Recommendations

  • Configuration Instructions: Please carefully read the setup instructions and comments in .env.example.en
  • Model Selection: If you use natively supported models such as GPT, Gemini, or Claude and don't need detailed runtime information, consider using the original Graphiti MCP
  • Issue Feedback: Issues and pull requests are welcome for any problems you encounter

Developed with assistance from 🤖 Augment Code

Server Config

{
  "mcpServers": {
    "graphiti_pro": {
      "transport": "http",
      "url": "http://localhost:8082/mcp"
    }
  }
}