Model Context Protocol (MCP)

Created by PouyaEsmaeili

Model Context Protocol (MCP) is a protocol designed to establish a communication channel between large language models (LLMs) and external tools or data sources. It follows a client-server architecture and uses JSON-RPC 2.0 for messaging. Although MCP was introduced only a few months ago by Anthropic, it has quickly gained significant attention in the AI community. In the sections below, I’ll explain the MCP specification and provide some Python examples.


MCP Server

An MCP server provides three key capabilities to the client:

  • Tool: A function or service that performs logic (e.g., a calculator), triggers side effects (e.g., modifies the environment), or is CPU-bound.
  • Resource: Any type of data, such as images, files, etc.
  • Prompt: A prompt template that can be used by the client.

An MCP server can be implemented using either FastMCP or the low-level APIs. It supports communication over SSE (Server-Sent Events) via HTTP or over stdio. Use stdio for local communication and SSE for network-based communication. In the MCP specification, the communication mechanism is called the transport layer.

FastMCP

Suppose we want to implement an MCP server for an English teaching application built on an LLM. This MCP server will provide an assessment quiz (Resource), a triage function to determine the learner's level (Tool), and a prompt template (Prompt).

from mcp.server.fastmcp import FastMCP

# You can customize important parameters by passing them to FastMCP.
# In this example, all key parameters are set to their default values,
# but you can modify them according to your needs.
# sse_url = http://0.0.0.0:8000/sse
mcp = FastMCP(
    name="MCPServer",
    debug=True,
    host="0.0.0.0",
    port=8000,
    sse_path="/sse",
    message_path="/messages/",
    log_level="DEBUG",
)


@mcp.resource(
    uri="https://quiz.xyz",
    name="GetQuiz",
    description="Provides a link to an online English level assessment quiz.",
)
def get_quiz() -> str:
    return "Link to online quiz: https://quiz.xyz"


@mcp.tool(
    name="FindLevel",
    description="Determines the student's English level based on their quiz score.",
)
def find_level(grade: int) -> str:
    if grade < 50:
        return "Beginner"
    if grade < 75:
        return "Intermediate"
    return "Expert"


@mcp.prompt(
    name="GetPrompt",
    description="Generates a prompt to ask an LLM to teach English based on the student's level.",
)
def get_prompt(name: str, level: str) -> str:
    return f"Teach {name} English based on this level: {level}."

There are different ways to serve FastMCP over SSE: sse_app, run_sse_async, or run. The first approach mounts the server in an ASGI framework (e.g., Starlette). The second approach uses Python's asyncio directly. The third approach is the FastMCP instance's built-in run method, which calls run_sse_async under the hood.

# Approach 1: mount sse_app() in an ASGI framework (Starlette).
from starlette.applications import Starlette
from starlette.routing import Mount
import uvicorn


if __name__ == "__main__":
    app = Starlette(debug=True, routes=[
        Mount("/", mcp.sse_app()),
    ])
    uvicorn.run(app)

# Approach 2: run the SSE server directly with asyncio.
import asyncio

async def main():
    await mcp.run_sse_async()


if __name__ == "__main__":
    asyncio.run(main())

# Approach 3: use the built-in run method.
# Pass transport="stdio" for a stdio server.
if __name__ == "__main__":
    mcp.run(transport="sse")

Low Level APIs

Implementing a server with the low-level APIs takes more effort. You have to instantiate the Server class and implement handlers that list and call each of the capabilities. There is a detailed guide on how to do this in the MCP python-sdk's official repository.
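
As a rough illustration, here is a minimal sketch of the same FindLevel tool implemented with the low-level Server class over a stdio transport. It follows the pattern used in the SDK's examples; handler signatures, import paths, and types may differ slightly between SDK versions, so treat it as an outline rather than a definitive implementation.

import asyncio

import mcp.types as types
from mcp.server.lowlevel import Server
from mcp.server.stdio import stdio_server

server = Server("MCPServer")


@server.list_tools()
async def list_tools() -> list[types.Tool]:
    # Advertise the FindLevel tool with a JSON Schema for its arguments.
    return [
        types.Tool(
            name="FindLevel",
            description="Determines the student's English level based on their quiz score.",
            inputSchema={
                "type": "object",
                "properties": {"grade": {"type": "integer"}},
                "required": ["grade"],
            },
        )
    ]


@server.call_tool()
async def call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    # Dispatch on the tool name and return content blocks.
    if name != "FindLevel":
        raise ValueError(f"Unknown tool: {name}")
    grade = arguments["grade"]
    level = "Beginner" if grade < 50 else "Intermediate" if grade < 75 else "Expert"
    return [types.TextContent(type="text", text=level)]


async def main() -> None:
    # Serve the low-level server over stdio.
    async with stdio_server() as (read_stream, write_stream):
        await server.run(read_stream, write_stream, server.create_initialization_options())


if __name__ == "__main__":
    asyncio.run(main())

The FastMCP decorators shown earlier generate this listing and dispatch boilerplate for you, which is why the high-level API is usually the better starting point.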


MCP Client

Based on how the server is exposed, there are two ways to implement a client: a stdio client and an SSE client.

SSE Client

Set transport to sse in fastmcp-server.py.

import asyncio
import logging
from typing import Union
from urllib.parse import urlparse

from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream

import mcp.types as types
from mcp.client.session import ClientSession
from mcp.client.sse import sse_client
from mcp.shared.session import RequestResponder

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("client")


async def message_handler(
    message: Union[
        RequestResponder[types.ServerRequest, types.ClientResult],
        types.ServerNotification,
        Exception,
    ],
) -> None:
    if isinstance(message, Exception):
        logger.error("Error: %s", message)
        return
    logger.info("Received message from server: %s", message)


async def run_session(
    read_stream: MemoryObjectReceiveStream,
    write_stream: MemoryObjectSendStream,
) -> None:
    async with ClientSession(
        read_stream,
        write_stream,
        message_handler=message_handler,
    ) as session:
        # Perform the initialization handshake before any other request.
        initialize_response = await session.initialize()
        logger.info(f"Initialized response: {initialize_response}")
        # Discovery: list the tools exposed by the server.
        tools = await session.list_tools()
        tools = tools.tools
        logger.info(f"List of tools: {[tool.name for tool in tools]}")
        # Invocation: call the FindLevel tool with a sample grade.
        response = await session.call_tool("FindLevel", {"grade": 86})
        logger.info(f"Find level response: {response}")


async def main(command_or_url: str) -> None:
    if urlparse(command_or_url).scheme in ("http", "https"):
        async with sse_client(command_or_url) as streams:
            await run_session(*streams)


if __name__ == "__main__":
    sse_url = "http://0.0.0.0:8000/sse"  # change url
    asyncio.run(main(sse_url))

Stdio Client

You have to pass the server script path to run the stdio client (set transport to stdio in fastmcp-server.py):

python stdio.py fastmcp-server.py

import asyncio
import sys
import logging
from contextlib import AsyncExitStack
from typing import Optional

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("client")


class MCPClient:
    def __init__(self) -> None:
        self.session: Optional[ClientSession] = None
        self.exit_stack = AsyncExitStack()

    async def connect_to_server(self, server_script_path: str) -> None:
        if not server_script_path.endswith(".py"):
            raise ValueError("Server script must be a .py file")

        server_params = StdioServerParameters(
            command="python",
            args=[server_script_path],
            env=None,
        )

        # Spawn the server script as a subprocess and talk to it over stdio.
        stdio_transport = await self.exit_stack.enter_async_context(
            stdio_client(server_params)
        )
        self.stdio, self.write = stdio_transport

        self.session = await self.exit_stack.enter_async_context(
            ClientSession(self.stdio, self.write)
        )
        initialize_response = await self.session.initialize()
        logger.info(f"Initialized response: {initialize_response}")

    async def list_tools(self) -> None:
        if not self.session:
            raise RuntimeError("Session is not initialized.")
        response = await self.session.list_tools()
        tools = response.tools
        logger.info(f"List of tools: {[tool.name for tool in tools]}")

    async def find_level(self, grade: int) -> None:
        response = await self.session.call_tool("FindLevel", {"grade": grade})
        logger.info(f"Find level response: {response}")

    async def cleanup(self) -> None:
        await self.exit_stack.aclose()


async def main() -> None:
    if len(sys.argv) < 2:
        print("Usage: python client.py <path_to_server_script>")
        sys.exit(1)

    client = MCPClient()
    try:
        await client.connect_to_server(sys.argv[1])
        await client.list_tools()
        await client.find_level(86)
    except Exception as exc:
        print("Error:", exc)
    finally:
        await client.cleanup()


if __name__ == "__main__":
    asyncio.run(main())

Protocol Specification

Three types of messages are exchanged between the client and server: requests, responses, and notifications. All messages are formatted using JSON-RPC 2.0 to enable remote procedure calls.

Standard JSON-RPC error codes:

  • PARSE_ERROR = -32700
  • INVALID_REQUEST = -32600
  • METHOD_NOT_FOUND = -32601
  • INVALID_PARAMS = -32602
  • INTERNAL_ERROR = -32603
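
To make the framing concrete, here is a sketch of these message types written as Python dictionaries. The shapes follow JSON-RPC 2.0, and the tools/call payload mirrors the FindLevel example above; the exact result fields and notification names are my reading of the spec, so consult the MCP specification for the authoritative schemas.

# Request: the client asks the server to invoke a tool (illustrative payload).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "FindLevel", "arguments": {"grade": 86}},
}

# Response: a successful result carries the same id as the request.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "Expert"}]},
}

# Error response: uses one of the standard JSON-RPC error codes listed above.
error_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "error": {"code": -32601, "message": "Method not found"},
}

# Notification: has no id, so no response is expected.
notification = {
    "jsonrpc": "2.0",
    "method": "notifications/tools/list_changed",
}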

Initialization

The client must complete a successful initialization handshake with the server before exchanging other messages (a sketch of the handshake messages follows the list below).

  1. The client sends initialization request (InitializeRequest).
  2. The server responds with its protocol version and capabilities (InitializeResult).
  3. The client sends an initialized notification to acknowledge (InitializedNotification).
  4. Connection is established.
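
Under the same caveat that the authoritative schemas live in the MCP specification, the handshake messages look roughly like this as Python dictionaries (the protocol version string and capability fields shown here are illustrative placeholders):

# 1. InitializeRequest: the client announces its protocol version and capabilities.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 0,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # placeholder version string
        "capabilities": {},
        "clientInfo": {"name": "client", "version": "0.1.0"},
    },
}

# 2. InitializeResult: the server responds with its own version and capabilities.
initialize_result = {
    "jsonrpc": "2.0",
    "id": 0,
    "result": {
        "protocolVersion": "2024-11-05",
        "capabilities": {"tools": {}, "resources": {}, "prompts": {}},
        "serverInfo": {"name": "MCPServer", "version": "0.1.0"},
    },
}

# 3. InitializedNotification: the client acknowledges; the connection is ready.
initialized_notification = {
    "jsonrpc": "2.0",
    "method": "notifications/initialized",
}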

Discovery

The client can discover what the server exposes:

  • tools/list to list the available tools
  • resources/list to list the available resources
  • prompts/list to list the available prompt templates

Invocation

The client can then use those capabilities:

  • tools/call to invoke a tool with arguments
  • resources/read to read a resource by its URI
  • prompts/get to fetch a rendered prompt template

A minimal client-side sketch of both phases is shown below.
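
Assuming a session like the one created in the SSE client above, the discovery and invocation steps map onto ClientSession methods roughly as follows. The resource URI and prompt arguments are taken from the earlier FastMCP example; the method names reflect my reading of the python-sdk and may vary between versions.

from mcp.client.session import ClientSession
from pydantic import AnyUrl


async def discover_and_invoke(session: ClientSession) -> None:
    # Discovery: ask the server what it exposes.
    tools = await session.list_tools()
    resources = await session.list_resources()
    prompts = await session.list_prompts()
    print([t.name for t in tools.tools])
    print([r.name for r in resources.resources])
    print([p.name for p in prompts.prompts])

    # Invocation: call a tool, read a resource, and fetch a prompt.
    level = await session.call_tool("FindLevel", {"grade": 86})
    quiz = await session.read_resource(AnyUrl("https://quiz.xyz"))
    prompt = await session.get_prompt("GetPrompt", {"name": "Alice", "level": "Expert"})
    print(level, quiz, prompt)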


References

  1. Introducing the Model Context Protocol
  2. MCP Python SDK (GitHub)
  3. Model Context Protocol (GitHub)
  4. Model Context Protocol Documentation