📚 Simple MCP Info Server

Created by yuvraj1898, 8 months ago

A FastMCP-powered microservice that provides search tools for Wikipedia and Arxiv using the LangChain Community utilities.


This project demonstrates how to create and run an MCP (Model Context Protocol) server using FastMCP. It provides simple tools that Claude Desktop can use to search Wikipedia and fetch academic papers from Arxiv.


🚀 What is MCP?

Model Context Protocol (MCP) is a protocol that allows language models like Claude to communicate with external tools in a standardized way. These tools can be:

  • Functions written in any language
  • Web APIs
  • Local scripts

Using MCP, Claude can invoke your tools, get the response, and continue the conversation.


📚 What This Project Does

This Info Server exposes two tools via MCP:

  1. get_info(searchterm: str) — Searches Wikipedia for short summaries
  2. get_research_paper(searchterm: str) — Searches Arxiv for academic paper metadata

⚡ Overview

This tool exposes two FastMCP-compatible endpoints:

  • get_info: Fetches summarized information from Wikipedia.
  • get_research_paper: Retrieves academic paper details from Arxiv.

Designed for plug-and-play use in modular AI systems or agent runtimes.


🧰 Setup Guide

✅ Prerequisites

  • Python 3.12+
  • uv installed

    Install via curl -LsSf https://astral.sh/uv/install.sh | sh or brew install astral-sh/uv/uv

    On Windows: powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

  • Claude Desktop installed


🛠️ Project Initialization

uv init simple_mcp_server
cd simple_mcp_server
uv venv
source .venv/bin/activate

📦 Install Dependencies

uv add "mcp[cli]"
uv add langchain_community
uv add wikipedia
uv add arxiv

The entry point is server.py, a FastMCP script that runs your Wikipedia + Arxiv search server locally.

🚀 Run the Server

Ensure you’re in the virtual environment, then:

uv run server.py

The server runs over stdio, ready to be called as an MCP tool. Verify that it starts without errors before wiring it into a client.

Option 1: Update claude_desktop_config.json (for Claude Desktop)

Run this command to open the config file in VS Code (or use any editor):

code ~/Library/Application\ Support/Claude/claude_desktop_config.json

Replace or add the following mcpServers block:

{
  "mcpServers": {
    "info-server": {
      "command": "/Users/yuvrajfirodiya/.local/bin/uv",
      "args": [
        "--directory",
        "/Users/yuvrajfirodiya/Source/Python-Langchain-Projects/simple_mcp_server",
        "run",
        "server.py"
      ]
    }
  }
}

✅ Verify the following:

  • command: points to your uv binary (run which uv in a terminal to confirm).
  • args: points to the root of your MCP project.
  • "run", "server.py": you’re running the correct server file.
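A quick stdlib-only way to sanity-check the block before restarting Claude Desktop is to build it in Python and round-trip it through the json module (the project path below is the README's example; substitute your own, and note shutil.which resolves uv on your machine):

```python
import json
import pathlib
import shutil

# Rebuild the mcpServers block, resolving the uv binary dynamically
config = {
    "mcpServers": {
        "info-server": {
            "command": shutil.which("uv") or "uv",
            "args": [
                "--directory",
                str(pathlib.Path.home()
                    / "Source/Python-Langchain-Projects/simple_mcp_server"),
                "run",
                "server.py",
            ],
        }
    }
}

# Round-trip to confirm the JSON is well-formed before pasting it in
text = json.dumps(config, indent=2)
assert json.loads(text)["mcpServers"]["info-server"]["args"][-1] == "server.py"
print(text)
```

Paste the printed JSON into claude_desktop_config.json, then restart Claude Desktop so it picks up the new server.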

Or

Option 2: Use a Custom MCP Client (Host)

You can also run the server using a custom MCP client built with the mcpclient library.

This is useful for advanced workflows, testing, or when integrating with your own LLM setup.

📄 client.py includes the full server specification and config. Set up the client using a structured config like this:

import asyncio
import os
from dotenv import load_dotenv
from langchain_groq import ChatGroq
from mcp_use import MCPAgent, MCPClient

async def main():
    # Load environment variables (e.g., GROQ_API_KEY)
    load_dotenv()

    # Define the MCP server config
    config = {
        "mcpServers": {
            "info-server": {
                "command": "/Users/yuvrajfirodiya/.local/bin/uv",
                "args": [
                    "--directory",
                    "/Users/yuvrajfirodiya/Source/Python-Langchain-Projects/simple_mcp_server",
                    "run",
                    "server.py"
                ],
                "env": {
                    "DISPLAY": ":1"
                }
            }
        }
    }

    # Initialize the client
    client = MCPClient.from_dict(config)

    # Initialize the Groq LLM (Llama 3)
    llm = ChatGroq(
        model_name="Llama3-8b-8192",
        streaming=True
    )

    # Build your agent
    agent = MCPAgent(llm=llm, client=client, max_steps=30)

    # Ask a question
    result = await agent.run("tell me about donald trump")
    print(f"\nResult: {result}")

if __name__ == "__main__":
    asyncio.run(main())
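The client above reads GROQ_API_KEY via load_dotenv(), so place a .env file next to client.py (placeholder value shown):

```
GROQ_API_KEY=your_groq_api_key_here
```

Then run it with uv run client.py. Note that the client also needs the mcp-use, langchain-groq, and python-dotenv packages added to the project, matching its imports.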

✅ Benefits of This Approach

  • 🔓 No dependency on the Claude Desktop UI
  • 🧩 Easily swap out LLMs (Groq, OpenAI, etc.)
  • 🧠 Full control over how your agent interacts with tools
  • ⚙️ Configurable, scriptable, and scalable
