MCP Server for Up-to-Date Library Documentation


This project implements a Model Context Protocol (MCP) server in Python. Its primary function is to provide Large Language Models (LLMs) like Anthropic's Claude with real-time access to the latest documentation for specified Python libraries (Langchain, LlamaIndex, OpenAI) before they generate code suggestions.

Problem Solved

An LLM's knowledge is bounded by its training data cutoff, which can lead to outdated code suggestions, especially for the rapidly evolving libraries common in the AI/ML space. This MCP server addresses that gap by acting as a tool that lets the LLM dynamically fetch the most current documentation snippets and incorporate them into its context before responding to coding queries.

Features

  • MCP Standard: Implements the Model Context Protocol for seamless integration with compatible clients (e.g., Claude Desktop, Claude Code).
  • get_docs Tool: Exposes a specific tool that searches official documentation sites.
  • Targeted Search: Uses the Serper API to perform site-specific Google searches (see the mapping sketch after this list), ensuring results come directly from the official docs for:
    • Langchain (python.langchain.com/docs)
    • LlamaIndex (docs.llamaindex.ai/en/stable)
    • OpenAI (platform.openai.com/docs)
  • Content Fetching: Retrieves and parses the text content from the top search results using httpx and BeautifulSoup.
  • Modern Tooling: Built with Python 3.11+, asyncio, FastMCP, and managed using the uv package manager.
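
To make the targeted search concrete, here is a minimal sketch of the library-to-docs mapping and the site-restricted query construction. The names (docs_urls, build_site_query) are illustrative and not necessarily what main.py actually uses.

    # Illustrative mapping from supported library name to its official docs site.
    docs_urls = {
        "langchain": "python.langchain.com/docs",
        "llama-index": "docs.llamaindex.ai/en/stable",
        "openai": "platform.openai.com/docs",
    }

    def build_site_query(library: str, query: str) -> str:
        """Build a site-restricted Google query, e.g. 'site:platform.openai.com/docs embeddings'."""
        if library not in docs_urls:
            raise ValueError(f"Unsupported library: {library}")
        return f"site:{docs_urls[library]} {query}"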

Architecture Overview

This server functions as a specialized "toolbox" within the MCP ecosystem:

  1. An MCP Host (e.g., Claude Desktop, IDE with Claude Code) initiates a request requiring coding assistance for a supported library.
  2. The MCP Client within the Host connects to this running MCP Server.
  3. The LLM, recognizing the need for potentially up-to-date information, decides to use the get_docs tool provided by this server.
  4. The Client invokes the get_docs tool on this server, passing the user's query and the target library.
  5. This MCP Server constructs a site-specific search query (e.g., site:python.langchain.com/docs <user_query>).
  6. It queries the Serper API to get the top documentation page links.
  7. It fetches the content of these pages using httpx and extracts the relevant text using BeautifulSoup (see the end-to-end sketch after this list).
  8. The extracted text (context) is returned to the MCP Client/Host.
  9. The LLM uses this fresh context alongside the original prompt to generate a more accurate and up-to-date response/code suggestion.
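
The whole flow above can be sketched as a single FastMCP tool. This is a simplified illustration built from the pieces this README already names (FastMCP, Serper, httpx, BeautifulSoup, python-dotenv); function and variable names are illustrative, and the Serper request/response fields should be verified against serper.dev before relying on them.

    import os
    import httpx
    from bs4 import BeautifulSoup
    from dotenv import load_dotenv
    from mcp.server.fastmcp import FastMCP

    load_dotenv()
    SERPER_URL = "https://google.serper.dev/search"

    # Supported libraries and their official docs sites (mirrors the Features section).
    docs_urls = {
        "langchain": "python.langchain.com/docs",
        "llama-index": "docs.llamaindex.ai/en/stable",
        "openai": "platform.openai.com/docs",
    }

    mcp = FastMCP("docs")

    async def search_web(query: str) -> dict:
        """Ask Serper for the top results of a site-restricted Google search."""
        async with httpx.AsyncClient() as client:
            resp = await client.post(
                SERPER_URL,
                headers={
                    "X-API-KEY": os.getenv("SERPER_API_KEY", ""),
                    "Content-Type": "application/json",
                },
                json={"q": query, "num": 2},
                timeout=30.0,
            )
            resp.raise_for_status()
            return resp.json()

    async def fetch_url(url: str) -> str:
        """Download a documentation page and return its visible text."""
        async with httpx.AsyncClient() as client:
            resp = await client.get(url, timeout=30.0, follow_redirects=True)
            soup = BeautifulSoup(resp.text, "html.parser")
            return soup.get_text()

    @mcp.tool()
    async def get_docs(query: str, library: str) -> str:
        """Search the latest docs for a query in one of: langchain, llama-index, openai."""
        if library not in docs_urls:
            raise ValueError(f"Library {library} not supported; choose from {list(docs_urls)}")
        results = await search_web(f"site:{docs_urls[library]} {query}")
        pages = results.get("organic", [])
        if not pages:
            return "No results found"
        return "\n\n".join([await fetch_url(page["link"]) for page in pages])

    if __name__ == "__main__":
        mcp.run(transport="stdio")

When a client invokes get_docs, the text it returns is what gets injected as fresh context in step 9.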

Prerequisites

  • Python 3.11+
  • uv Package Manager: Install from Astral.sh.
  • Serper API Key: Obtain a free or paid key from serper.dev.
  • Node.js/npx: Required only if you plan to use the MCP Inspector for debugging.

Installation & Setup

  1. Clone the Repository (if applicable):

    git clone <your-repository-url>
    cd <your-repository-name>
    
  2. Initialize Project (if starting fresh):

    # If you haven't cloned a repo with pyproject.toml
    uv init mcp-server
    cd mcp-server
    
  3. Create and Activate Virtual Environment:

    uv venv
    # Activate (Linux/macOS):
    source .venv/bin/activate
    # Activate (Windows PowerShell):
    .\.venv\Scripts\Activate.ps1
    # Activate (Windows Cmd):
    .\.venv\Scripts\activate.bat
    
  4. Install Dependencies:

    uv add "mcp[cli]" httpx python-dotenv bs4
    # Or, if dependencies are listed in pyproject.toml:
    # uv sync
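
For reference, the uv sync path in step 4 assumes a pyproject.toml whose dependency list looks roughly like the sketch below; the real file produced by uv init / uv add will also carry version constraints and additional project metadata.

    [project]
    name = "mcp-server"
    version = "0.1.0"
    requires-python = ">=3.11"
    dependencies = [
        "mcp[cli]",
        "httpx",
        "python-dotenv",
        "bs4",
    ]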
    

Configuration

  1. Create a file named .env in the root directory of the project.

  2. Add your Serper API key to this file:

    SERPER_API_KEY=your_actual_serper_api_key_here
    

    (The .gitignore file is already configured to prevent committing this file)
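
With the key in .env, main.py would typically load it at startup via python-dotenv. A minimal sketch (the variable name and error handling are illustrative):

    import os
    from dotenv import load_dotenv

    # Read SERPER_API_KEY (and any other variables) from .env into the environment.
    load_dotenv()

    SERPER_API_KEY = os.getenv("SERPER_API_KEY")
    if not SERPER_API_KEY:
        raise RuntimeError("SERPER_API_KEY is not set; add it to your .env file")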

Usage

  1. Run the MCP Server: Make sure your virtual environment is activated.

    uv run main.py
    

    The server will start and listen for connections via standard input/output (stdio), as configured in main.py.

  2. Integrate with MCP Clients:

    • Claude Desktop:

      • Go to Settings > Developer > Edit Configuration.
      • Add an entry under mcpServers. You'll need to provide the full path to your uv executable and specify the command arguments.
      • Example structure (adjust the server name and paths to your setup; the file must be valid JSON, so no comments):
        {
          "mcpServers": {
            "docs-helper": {
              "command": "/full/path/to/uv",
              "args": [
                "--directory",
                "/full/path/to/your/mcp-server/project",
                "run",
                "main.py"
              ]
            }
          }
        }
      • Restart Claude Desktop. A tool hammer icon should appear.
    • Claude Code (CLI):

      • Use the claude mcp add command interactively or with flags.
      • Example interactive session prompts:
        • Server Name: documentation-fetcher (or your choice)
        • Project Type: local
        • Command: Specify the full path to uv and arguments, similar to Claude Desktop (e.g., /full/path/to/uv run main.py within the project directory).
        • Working Directory: /full/path/to/your/mcp-server/project
      • Use claude mcp list to verify.
      • Run claude - the tool should be listed.
    • Refer to the official Anthropic MCP documentation for the most up-to-date client configuration details.

Development & Debugging

The MCP Inspector is a valuable tool for testing your server's capabilities without needing a full client integration.

  1. Ensure Node.js and npx are installed.
  2. Run the inspector, pointing it to your server's run command:
    # Ensure your .venv is activated first
    npx @modelcontextprotocol/inspector uv run main.py
    
  3. Open your web browser to the local URL the Inspector prints on startup (e.g., http://localhost:5173; newer Inspector releases use a different default port).
  4. Connect to the server via the Inspector interface.
  5. Navigate to the "Tools" section, select get_docs, provide test values for query and library, and click "Run Tool" to see the output.
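
Because the project installs mcp[cli], the MCP Python SDK's own CLI offers an alternative way to launch the Inspector, assuming main.py exposes its FastMCP instance at module level:

    # Runs the MCP Inspector wired to this server via the SDK's CLI
    mcp dev main.py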

License

This project is licensed under the MIT License - see the LICENSE file for details. (If the repository does not yet include one, add a LICENSE file containing the MIT license text.)

