MCP-RAG: Modular RAG Pipeline using MCP & GroundX

A production-ready Retrieval-Augmented Generation setup


🚀 Overview

MCP-RAG is a modular, production-grade implementation of a Retrieval-Augmented Generation (RAG) system, powered by:

  • 🧠 MCP (Model Context Protocol) for standardized tool orchestration
  • 🔍 GroundX for semantic search, ingestion, and vector store operations
  • 🤖 OpenAI GPT-4 for LLM-powered contextual response generation

It allows clean separation of responsibilities across ingestion, search, generation, and tool discovery — making it scalable, flexible, and enterprise-ready.

📌 Developed by Sujith Somanunnithan for teams building AI-driven applications with reusable components.


📦 Features

  • 🔧 Modular Tool Design using MCP server interface
  • 🧩 YAML-Based Prompt Templates with Jinja2 rendering
  • 📂 PDF File Ingestion into GroundX vector store
  • 🔍 Real-Time Semantic Search via GroundX Search Tool
  • 🤝 Plug-and-Play API Integration for new tools and services

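The YAML-plus-Jinja2 prompt pattern from the feature list can be sketched as follows. The template text and the variable names (`query`, `snippets`) are illustrative stand-ins, not the project's actual `prompts.yaml` contents:

```python
import yaml
from jinja2 import Template

# Illustrative stand-in for prompts.yaml -- the real template keys may differ.
PROMPTS_YAML = """
rag_prompt: |
  Answer the question using only the context below.
  Context:
  {% for s in snippets %}- {{ s }}
  {% endfor %}
  Question: {{ query }}
"""

def render_prompt(query, snippets):
    """Load the template from YAML, then render it with Jinja2."""
    templates = yaml.safe_load(PROMPTS_YAML)
    return Template(templates["rag_prompt"]).render(query=query, snippets=snippets)

prompt = render_prompt("What is MCP?", ["MCP standardizes tool orchestration."])
print(prompt)
```

Keeping prompts in YAML rather than inline strings lets them be edited and versioned independently of the tool code.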
📁 Project Structure

mcp-rag/
├── server.py                     # MCP Server initialization
├── config.py                     # Environment and config management
├── ingestion.py                  # File ingestion tool logic
├── search.py                     # Search + LLM generation logic
├── prompts.yaml                  # Prompt template in Jinja2
├── models.py                     # Pydantic models for configs
├── .env                          # API keys (excluded from version control)
├── pyproject.toml                # Project config for uv / MCP
├── README.md                     # This file

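Based on the layout above, `config.py` presumably loads the two API keys from the environment and validates them via the Pydantic models in `models.py`. A minimal stdlib-only sketch of that idea (class and field names here are assumptions, not the project's actual code):

```python
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class Settings:
    """Hypothetical config model; the real project uses Pydantic for this."""
    openai_api_key: str
    groundx_api_key: str

    @classmethod
    def from_env(cls):
        # Fail fast at startup if a key is missing, rather than mid-request.
        try:
            return cls(
                openai_api_key=os.environ["OPENAI_API_KEY"],
                groundx_api_key=os.environ["GROUNDX_API_KEY"],
            )
        except KeyError as exc:
            raise RuntimeError(f"Missing required environment variable: {exc}") from exc

# Demo values for illustration; in the real server these come from .env
os.environ["OPENAI_API_KEY"] = "demo-openai-key"
os.environ["GROUNDX_API_KEY"] = "demo-groundx-key"
settings = Settings.from_env()
```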
🧠 Architectural Flow

  1. User query arrives at the MCP server
  2. Server routes it to the Search Tool
  3. Search Tool queries GroundX API
  4. Snippets are rendered via YAML prompt
  5. OpenAI API generates final LLM response

🔄 All tools are discoverable and invocable via MCP dynamically.
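The five steps above can be sketched as plain functions. The GroundX search and the OpenAI call are stubbed out here, since in the real project that wiring lives in `search.py` behind the MCP server:

```python
def search_groundx(query):
    # Stub for the GroundX semantic-search call (step 3).
    return ["GroundX returns scored text snippets for the query."]

def build_prompt(query, snippets):
    # Step 4: render retrieved snippets into the prompt context.
    context = "\n".join(f"- {s}" for s in snippets)
    return f"Context:\n{context}\n\nQuestion: {query}"

def generate_answer(prompt):
    # Stub for the OpenAI GPT-4 completion call (step 5).
    return f"LLM answer based on: {prompt.splitlines()[-1]}"

def process_search_query(query):
    """End-to-end flow: search -> prompt -> generate (steps 2-5)."""
    snippets = search_groundx(query)
    prompt = build_prompt(query, snippets)
    return generate_answer(prompt)

print(process_search_query("What is explained in section 3?"))
```

Because each step is its own function, any stage (retriever, prompt, LLM) can be swapped without touching the others, which is the separation of concerns the project aims for.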


🔑 Environment Setup

Create .env with your keys:

OPENAI_API_KEY=your-openai-key
GROUNDX_API_KEY=your-groundx-key

Install using uv:

uv pip install -r pyproject.toml

⚙️ Usage

Start the server:

mcp dev server.py

Ingest a PDF:

mcp call ingest_documents --args '{"file_path": "data/sample.pdf"}'

Search with a query:

mcp call process_search_query --args '{"query": "What is explained in section 3?"}'
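Both CLI calls above go through MCP's tool discovery and invocation. The mechanics can be illustrated with a plain function registry; this is a simplification of what the MCP server does, not the actual `mcp` SDK API:

```python
import json

TOOLS = {}

def tool(fn):
    """Register a function as an invocable tool, keyed by name (mimics MCP discovery)."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def ingest_documents(file_path):
    # Stand-in for the ingestion tool; the real one uploads to GroundX.
    return {"status": "ingested", "file_path": file_path}

@tool
def process_search_query(query):
    # Stand-in for the search tool; the real one calls GroundX + OpenAI.
    return {"status": "ok", "query": query}

def call(name, args_json):
    """Dispatch a call the way `mcp call <name> --args '<json>'` would."""
    return TOOLS[name](**json.loads(args_json))

print(call("ingest_documents", '{"file_path": "data/sample.pdf"}'))
```

The point of the registry pattern is that clients never import tool code directly; they discover tool names at runtime and pass JSON arguments, which is what makes new tools plug-and-play.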

📌 Clean Separation of Concerns

Role                    | Component
------------------------|--------------------
Tool discovery / invoke | MCP Server
Search execution        | GroundX API
Response generation     | OpenAI API
File upload             | Ingest Tool (MCP)

📚 License

This project is licensed under the MIT License.


👨‍💻 Author

Sujith Somanunnithan
Cloud & AI Architect | sujith.de


💬 Feedback & Contributions

Feel free to raise issues, open pull requests, or connect with the author about improvements and extensions (such as multi-file ingestion or RAG fallback chains).
