
Model Context Protocol (MCP)

Created by: Techiral
🚀 OpenClient: The CLI-Based Universal AI Application Connector! An open-source Model Context Protocol (MCP) implementation that boosts LLMs by standardizing how context is provided to them. Quickly connect a server of your choice to our client to extend your AI's capabilities. Ideal for developers building next-generation AI applications!

Model Context Protocol (MCP)

MCP is an open protocol that standardizes how applications provide context to LLMs - think of it like USB-C for AI applications. It enables seamless connection between AI models and various data sources/tools.

🔌 Why MCP?

MCP helps build agents and complex workflows on top of LLMs by providing:

  • Pre-built integrations for your LLM to plug into
  • Flexibility to switch between LLM providers
  • Secure data handling best practices
  • Standardized interface for AI applications

🏗️ Core Components

flowchart LR
    A[MCP Host] --> B[MCP Client]
    B --> C[Terminal]
    B --> D[Filesystem]
    B --> E[Memory]
    C --> F[Local Data]
    D --> G[Local Files]
    E --> H[Remote APIs]

  1. MCP Hosts: Applications (like Claude Desktop, IDEs) that need AI context
  2. MCP Clients: Protocol handlers that manage server connections
  3. MCP Servers: Lightweight programs exposing specific capabilities:
    • Terminal Server: Execute commands
    • Filesystem Server: Access local files
    • Memory Server: Persistent data storage
  4. Data Sources:
    • Local: Files, databases on your machine
    • Remote: Web APIs and cloud services
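
To make the Host → Client → Server relationship concrete, the sketch below shows a client launching a server over stdio and asking it for its tools. It is a minimal illustration using the official Python MCP SDK (the mcp package); the server path mirrors this repository's layout and is not meant as the project's exact code.

# Minimal sketch: an MCP client connecting to a stdio server and listing its tools.
# Assumes the official Python SDK ("mcp" package); the server path is illustrative.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # The client launches the server as a subprocess and talks to it over stdio
    server = StdioServerParameters(
        command="uv",
        args=["run", "servers/terminal_server/terminal_server.py"],
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()            # protocol handshake
            tools = await session.list_tools()    # discover server capabilities
            print([tool.name for tool in tools.tools])


asyncio.run(main())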

🚀 System Overview

flowchart LR
    User --> Client
    Client --> AI[AI Processing]
    Client --> Terminal[Terminal]
    Client --> Filesystem[Filesystem]
    Client --> Memory[Memory]

Core Components:

  • AI Processing: Google Gemini + LangChain for natural language understanding
  • Terminal Server: Executes system commands in isolated workspace
  • Filesystem Server: Manages file operations
  • Memory Server: Stores and retrieves persistent data

Key Features:

  • Automatic server startup as needed
  • Secure workspace isolation
  • Flexible configuration
  • Extensible architecture

📂 File Structure

flowchart TD
    A[mcp] --> B[clients]
    A --> C[servers]
    A --> D[workspace]
    
    B --> E[mcp-client]
    E --> F[main.py]
    E --> G[langchain_mcp_client_wconfig.py]
    E --> H[theailanguage_config.json]
    E --> I[.env]
    
    C --> J[terminal_server]
    J --> K[terminal_server.py]
    
    D --> L[memory.json]
    D --> M[notes.txt]

Key Files:

  • clients/mcp-client/main.py: Main client entry point
  • clients/mcp-client/langchain_mcp_client_wconfig.py: AI integration
  • clients/mcp-client/theailanguage_config.json: Server configurations
  • clients/mcp-client/.env: Environment variables
  • servers/terminal_server/terminal_server.py: Terminal server
  • workspace/memory.json: Persistent memory storage
  • workspace/notes.txt: System notes

File Type Breakdown:

  • Python Files (60%):

    • Core application logic and business rules
    • Server implementations and client applications
    • Includes both synchronous and asynchronous code
    • Follows PEP 8 style guidelines
  • JSON Files (20%):

    • Configuration files for servers and services
    • API request/response schemas
    • Persistent data storage format
    • Strict schema validation enforced
  • Text Files (15%):

    • System documentation (READMEs, guides)
    • Developer notes and annotations
    • Temporary data storage
    • Plaintext logs and outputs
  • Other Formats (5%):

    • Environment files (.env)
    • Git ignore patterns
    • License information
    • Build configuration files

🔌 Client Components

flowchart TD
    A[User Input] --> B[Client]
    B --> C{Type?}
    C -->|Command| D[Terminal]
    C -->|File| E[Filesystem]
    C -->|Memory| F[Storage]
    C -->|AI| G[Gemini]
    D --> H[Response]
    E --> H
    F --> H
    G --> H
    H --> I[Output]

Main Client Files:

  • langchain_mcp_client_wconfig.py: Main client application
  • theailanguage_config.json: Server configurations
  • .env: Environment variables

Key Features:

  • Manages multiple MCP servers
  • Integrates Google Gemini for natural language processing
  • Handles dynamic response generation
  • Processes LangChain objects
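
A rough sketch of how such a client could wire Gemini to MCP servers is shown below. It assumes the langchain-mcp-adapters, langchain-google-genai, and langgraph packages, whose APIs change between versions; it is not the repository's actual langchain_mcp_client_wconfig.py.

# Rough sketch (not the repository's actual client): exposing MCP servers as
# LangChain tools and driving them with Gemini. Package names and call signatures
# are assumptions and may differ between library versions.
import asyncio

from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent


async def main() -> None:
    client = MultiServerMCPClient({
        "terminal": {
            "command": "uv",
            "args": ["run", "servers/terminal_server/terminal_server.py"],
            "transport": "stdio",
        },
    })
    tools = await client.get_tools()                        # MCP tools -> LangChain tools
    llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")  # reads GOOGLE_API_KEY from env
    agent = create_react_agent(llm, tools)
    result = await agent.ainvoke({"messages": [("user", "List the files in the workspace")]})
    print(result["messages"][-1].content)


asyncio.run(main())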

Configuration:

  1. theailanguage_config.json:
{
  "mcpServers": {
    "terminal_server": {
      "command": "uv",
      "args": ["run", "../../servers/terminal_server/terminal_server.py"]
    },
    "memory": {
      "command": "npx.cmd",
      "args": ["@modelcontextprotocol/server-memory"],
      "env": {"MEMORY_FILE_PATH": "workspace/memory.json"}
    }
  }
}
  2. .env Setup:
GOOGLE_API_KEY=your_api_key_here
THEAILANGUAGE_CONFIG=clients/mcp-client/theailanguage_config.json

Setup Steps:

  1. Create .env file in clients/mcp-client/
  2. Add required variables
  3. Restart client after changes
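
The snippet below illustrates how a client might resolve these variables at startup. It assumes the python-dotenv package and uses the variable names defined above; the actual client may load its configuration differently.

# Illustrative only: resolving the .env variables and server config described above.
# Assumes the python-dotenv package; the real client may load config differently.
import json
import os

from dotenv import load_dotenv

load_dotenv("clients/mcp-client/.env")               # populate os.environ from .env

api_key = os.environ["GOOGLE_API_KEY"]               # fails fast if the key is missing
config_path = os.environ.get(
    "THEAILANGUAGE_CONFIG",
    "clients/mcp-client/theailanguage_config.json",  # fallback to the default location
)

with open(config_path, encoding="utf-8") as f:
    servers = json.load(f)["mcpServers"]             # per-server launch commands

print(f"Loaded {len(servers)} server definition(s): {', '.join(servers)}")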

🖥️ Server Components

classDiagram
    class TerminalServer {
        +path: String
        +run()
        +validate() 
        +execute()
    }
    TerminalServer --|> FastMCP
    class FastMCP {
        +decorate()
        +transport()
    }

Terminal Server

  • Purpose: Executes system commands in isolated workspace
  • Key Features:
    • Fast command execution
    • Secure workspace isolation
    • Comprehensive logging
  • Technical Details:
    • Uses FastMCP for transport
    • Validates commands before execution
    • Captures and returns output
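
A minimal sketch of such a server, using FastMCP from the Python MCP SDK, is shown below. The workspace path, timeout, and bare subprocess call are illustrative assumptions; the repository's servers/terminal_server/terminal_server.py may validate and log differently.

# Minimal sketch of a terminal-style MCP server using FastMCP from the Python SDK.
# The workspace path and subprocess handling are illustrative, not the project's exact code.
import subprocess
from pathlib import Path

from mcp.server.fastmcp import FastMCP

WORKSPACE = Path(__file__).resolve().parent / "workspace"   # assumed isolation directory
WORKSPACE.mkdir(parents=True, exist_ok=True)

mcp = FastMCP("terminal")


@mcp.tool()
def run_command(command: str) -> str:
    """Run a shell command inside the workspace and return its output."""
    result = subprocess.run(
        command,
        shell=True,
        cwd=WORKSPACE,               # confine execution to the workspace
        capture_output=True,
        text=True,
        timeout=30,
    )
    return result.stdout or result.stderr


if __name__ == "__main__":
    mcp.run(transport="stdio")       # serve over stdio so the client can launch it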

Workspace Files

memory.json

  • Purpose: Persistent data storage
  • Operations:
    • Store/update/read data
    • Query specific information
  • Example Structure:
{
  "user_preferences": {
    "favorite_color": "blue",
    "interests": ["science fiction"]
  },
  "system_state": {
    "last_commands": ["git status", "ls"]
  }
}
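
The read-modify-write pattern behind these operations looks roughly like the snippet below. In normal use the memory server owns this file; this is only an illustration of the update cycle using the example keys above.

# Illustrative read-modify-write of workspace/memory.json. The memory server
# normally manages this file; this only demonstrates the update pattern.
import json
from pathlib import Path

MEMORY_FILE = Path("workspace/memory.json")

data = json.loads(MEMORY_FILE.read_text(encoding="utf-8")) if MEMORY_FILE.exists() else {}
data.setdefault("user_preferences", {})["favorite_color"] = "blue"    # store/update
MEMORY_FILE.write_text(json.dumps(data, indent=2), encoding="utf-8")  # persist

print(data.get("user_preferences"))                                   # read/query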

notes.txt

  • Purpose: System documentation and notes
  • Content Types:
    • User documentation (40%)
    • System notes (30%)
    • Temporary data (20%)
    • Other (10%)

🛠️ Local Setup Guide

Prerequisites

  • Python 3.9+
  • Node.js 16+
  • Google API Key
  • UV Package Manager

Installation Steps

  1. Clone the repository:

    git clone https://github.com/Techiral/mcp.git
    cd mcp
    
  2. Set up Python environment:

    python -m venv venv
    # Linux/Mac:
    source venv/bin/activate
    # Windows:
    venv\Scripts\activate
    pip install -r requirements.txt
    
  3. Configure environment variables:

    echo "GOOGLE_API_KEY=your_key_here" > clients/mcp-client/.env
    echo "THEAILANGUAGE_CONFIG=clients/mcp-client/theailanguage_config.json" >> clients/mcp-client/.env
    
  4. Install Node.js servers:

    npm install -g @modelcontextprotocol/server-memory @modelcontextprotocol/server-filesystem
    

Verification Checklist:

  • Repository cloned
  • Python virtual environment created and activated
  • Python dependencies installed
  • .env file configured
  • Node.js servers installed

🚀 Usage Instructions

Basic Usage

  1. Start the client:
python clients/mcp-client/langchain_mcp_client_wconfig.py
  2. Type natural language requests and receive responses

Command Examples

File Operations:

Create a file named example.txt
Search for "function" in all Python files
Count lines in main.py

Web Content:

Summarize https://example.com
Extract headlines from news site

System Commands:

List files in current directory
Check Python version
Run git status

Memory Operations:

Remember my favorite color is blue
What preferences did I set?
Show recent commands

Server Configuration

Key Configuration Files:

  • theailanguage_config.json: Main server configurations
  • .env: Environment variables

Example Server Configs:

{
  "terminal_server": {
    "command": "uv",
    "args": ["run", "servers/terminal_server/terminal_server.py"]
  },
  "memory": {
    "command": "npx.cmd",
    "args": ["@modelcontextprotocol/server-memory"],
    "env": {"MEMORY_FILE_PATH": "workspace/memory.json"}
  }
}

Configuration Tips:

  • Use absolute paths for reliability
  • Set environment variables for sensitive data
  • Restart servers after configuration changes

🛠️ Troubleshooting

Common Issues & Solutions:

  1. Authentication Problems:

    • Verify Google API key in .env
    • Check key has proper permissions
    • Regenerate key if needed
  2. File Operations Failing:

    # Check permissions
    ls -la workspace/
    
    # Inspect the filesystem server with the MCP Inspector
    npx @modelcontextprotocol/inspector uvx mcp-server-filesystem
    
  3. Memory Operations Failing:

    # Verify memory.json exists
    ls workspace/memory.json
    
    # Restart memory server
    npx @modelcontextprotocol/server-memory
    

Debugging Tools:

  • Enable verbose logging:
    echo "LOG_LEVEL=DEBUG" >> clients/mcp-client/.env
    
  • List running servers:
    npx @modelcontextprotocol/inspector list
    


🤝 How to Contribute

Getting Started:

  1. Fork and clone the repository
  2. Set up development environment (see Local Setup Guide)

Development Workflow:

# Create feature branch
git checkout -b feature/your-feature

# Make changes following:
# - Python: PEP 8 style
# - JavaScript: StandardJS style
# - Document all new functions

# Run tests
python -m pytest tests/

# Push changes
git push origin feature/your-feature

Pull Requests:

  • Reference related issues
  • Describe changes clearly
  • Include test results
  • Squash commits before merging

Code Review:

  • Reviews typically within 48 hours
  • Address all feedback before merging

Recommended Setup:

  • VSCode with Python/JS extensions
  • Docker for testing
  • Pre-commit hooks