Model Context Protocol (MCP) Server


A modular Model Context Protocol server for AI services with multiple transport options and dynamic service selection. Built with SOLID principles for maintainability and extensibility.

Features

  • Multiple AI Services: Support for Claude, OpenAI, and mock services
  • Dynamic Service Selection: Choose AI service on a per-request basis
  • Multiple Transports:
    • stdio: For command-line usage and scripting
    • TCP: For network-based applications
    • WebSocket: For web browsers and real-time applications
  • JSON-RPC 2.0: Compliant interface for predictable interactions
  • Modular Architecture: Easy to extend with new services and transports
  • Environment Configuration: Simple setup via .env file
  • Streaming Support: Real-time response streaming for supported transports

Repository Structure

basic-mcp-server/
├── .env                      # Environment configuration
├── .gitignore                # Git ignore rules
├── README.md                 # Project documentation
├── examples/                 # Example clients
│   ├── example_client.py     # Command-line client example
│   └── websocket_client.html # Browser WebSocket client
├── mcp_server.py             # Main entry point
└── mcp_server/               # Core package
    ├── config/               # Configuration management
    │   ├── settings.py       # Environment and settings handling
    │   └── __init__.py
    ├── core/                 # Core server logic
    │   ├── server.py         # Main server implementation
    │   └── __init__.py
    ├── handlers/             # Method handlers
    │   ├── base_handlers.py  # Standard MCP handlers
    │   ├── system_handlers.py # System info handlers
    │   └── __init__.py
    ├── models/               # Data models
    │   ├── json_rpc.py       # JSON-RPC data structures
    │   └── __init__.py
    ├── services/             # AI service implementations
    │   ├── claude_service.py # Anthropic Claude API
    │   ├── openai_service.py # OpenAI API
    │   └── __init__.py       # Service registry
    ├── transports/           # Communication protocols
    │   ├── base.py           # Transport interfaces
    │   ├── websocket.py      # WebSocket implementation
    │   └── __init__.py
    └── __init__.py

Installation

  1. Clone the repository:

    git clone https://github.com/shaswata56/basic-mcp-server.git
    cd basic-mcp-server
    
  2. Create a virtual environment and install dependencies:

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
    pip install -e .
    
  3. Configure your environment by editing the .env file with your API keys and settings.
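As an illustrative sketch (placeholder values, not real keys), a minimal `.env` for running the default stdio transport with Claude might look like this, using the variable names documented in the Configuration section:

```
AI_SERVICE_TYPE=claude
ANTHROPIC_API_KEY=your-anthropic-key
MCP_SERVER_NAME=ai-mcp-server
MCP_TRANSPORT_TYPE=stdio
```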

Configuration

Environment Variables

The server can be configured using environment variables in the .env file:

| Variable | Description | Default |
|----------|-------------|---------|
| `AI_SERVICE_TYPE` | Default AI service to use (`"claude"`, `"openai"`, `"mock"`) | `"claude"` |
| `SECRETS_FILE` | Path to JSON file with API secrets | None |
| `ANTHROPIC_API_KEY` | Your Anthropic API key | None |
| `OPENAI_API_KEY` | Your OpenAI API key | None |
| `MCP_SERVER_NAME` | Name of the server | `"ai-mcp-server"` |
| `MCP_SERVER_VERSION` | Server version | `"1.0.0"` |
| `MCP_TRANSPORT_TYPE` | Transport type (`"stdio"`, `"tcp"`, or `"websocket"`) | `"stdio"` |
| `MCP_TCP_HOST` | TCP/WebSocket host address | `"127.0.0.1"` |
| `MCP_TCP_PORT` | TCP server port | `9000` |
| `MCP_WS_PORT` | WebSocket server port | `8765` |
| `MCP_WS_PATH` | WebSocket server path | `"/"` |
| `MCP_WS_ORIGINS` | Comma-separated list of allowed origins | None (all allowed) |
| `CLAUDE_DEFAULT_MODEL` | Default Claude model | `"claude-3-opus-20240229"` |
| `CLAUDE_DEFAULT_MAX_TOKENS` | Default max tokens for Claude | `4096` |
| `CLAUDE_DEFAULT_TEMPERATURE` | Default temperature for Claude | `0.7` |
| `OPENAI_DEFAULT_MODEL` | Default OpenAI model | `"gpt-4o"` |
| `OPENAI_DEFAULT_MAX_TOKENS` | Default max tokens for OpenAI | `1024` |
| `OPENAI_DEFAULT_TEMPERATURE` | Default temperature for OpenAI | `0.7` |
| `EMBEDDINGS_3_LARGE_API_URL` | Azure endpoint for text-embedding-3-large | None |
| `EMBEDDINGS_3_LARGE_API_KEY` | API key for text-embedding-3-large | None |
| `EMBEDDINGS_3_SMALL_API_URL` | Azure endpoint for text-embedding-3-small | None |
| `EMBEDDINGS_3_SMALL_API_KEY` | API key for text-embedding-3-small | None |
| `AZURE_OPENAI_EMBEDDING_DEPLOYMENT` | Azure deployment name for embeddings | \<model name\> |
| `QDRANT_URL` | URL of the Qdrant server (use `:memory:` for in-memory) | None |
| `QDRANT_API_KEY` | API key for Qdrant Cloud | None |

When embedding API credentials are not provided, the server will generate deterministic mock embeddings so that testing can proceed without external services.
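The README does not show how the mock embeddings are produced; a common way to make them deterministic (and a reasonable mental model of the behavior described above) is to seed a random generator from a hash of the input text, so the same text always maps to the same vector. The helper below is a hypothetical sketch, not the server's actual code, and the dimension of 1536 is an assumption:

```python
# Sketch: deterministic mock embeddings (hypothetical helper, not the
# server's real implementation). Hashing the text seeds the RNG, so the
# same input always yields the same vector.
import hashlib
import random

def mock_embedding(text: str, dim: int = 1536) -> list[float]:
    # Use the first 8 bytes of the SHA-256 digest as a stable seed.
    seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:8], "big")
    rng = random.Random(seed)
    return [rng.uniform(-1.0, 1.0) for _ in range(dim)]

# Identical inputs produce identical vectors.
assert mock_embedding("hello") == mock_embedding("hello")
```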

For production deployments, configure QDRANT_URL to point to a dedicated Qdrant server. Using a remote server provides persistent storage and improved vector search performance compared to the default in-memory mode.

The optional SECRETS_FILE variable allows you to store API keys in a JSON file instead of environment variables. Values defined in the secrets file are used when corresponding environment variables are not set. If a secret value is an array, the server will rotate through the values each time the key is requested, enabling simple key rotation strategies.
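The array-rotation behavior can be sketched as follows. The `SecretStore` class and its methods are hypothetical (the server's real implementation may differ); the sketch only illustrates the round-robin semantics described above:

```python
# Sketch of the secrets-file rotation behavior (hypothetical SecretStore;
# the server's real class and method names may differ).
import itertools

class SecretStore:
    def __init__(self, secrets: dict):
        # Wrap list values in a cycling iterator so repeated lookups
        # rotate through the keys round-robin.
        self._values = {
            k: itertools.cycle(v) if isinstance(v, list) else v
            for k, v in secrets.items()
        }

    def get(self, key: str):
        v = self._values.get(key)
        return next(v) if hasattr(v, "__next__") else v

store = SecretStore({"OPENAI_API_KEY": ["key-1", "key-2"]})
print(store.get("OPENAI_API_KEY"))  # key-1
print(store.get("OPENAI_API_KEY"))  # key-2
print(store.get("OPENAI_API_KEY"))  # key-1 again
```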

Usage

Running the Server

Standard stdio Mode

python mcp_server.py

TCP Server Mode

python mcp_server.py --tcp --host 127.0.0.1 --port 9000

WebSocket Server Mode

python mcp_server.py --websocket --host 127.0.0.1 --port 8765 --ws-path /

Command Line Options

usage: mcp_server.py [-h] [--tcp | --websocket] [--host HOST] [--port PORT]
                     [--ws-path WS_PATH] [--service-type {claude,openai,mock}]
                     [--claude-api-key CLAUDE_API_KEY]
                     [--openai-api-key OPENAI_API_KEY]
                     [--qdrant-url QDRANT_URL]
                     [--qdrant-api-key QDRANT_API_KEY] [--mock]
                     [--log-level {DEBUG,INFO,WARNING,ERROR}]
                     [--env-file ENV_FILE]

AI MCP Server with JSON-RPC

options:
  -h, --help            show this help message and exit
  --log-level {DEBUG,INFO,WARNING,ERROR}
                        Logging level
  --env-file ENV_FILE   Path to .env file (default: .env in project root)

Transport Options:
  --tcp                 Run as TCP server
  --websocket           Run as WebSocket server
  --host HOST           Host to bind server
  --port PORT           Port for server
  --ws-path WS_PATH     URL path for WebSocket server (default: /)

AI Service Options:
  --service-type {claude,openai,mock}
                        AI service to use
  --claude-api-key CLAUDE_API_KEY
                        Anthropic API key
  --openai-api-key OPENAI_API_KEY
                        OpenAI API key
  --qdrant-url QDRANT_URL
                        Qdrant server URL
  --qdrant-api-key QDRANT_API_KEY
                        Qdrant API key
  --mock                Use mock AI service (for testing)

Client Examples

Command Line Client

The examples/example_client.py script provides a simple way to interact with the server:

# Initialize connection
python examples/example_client.py initialize

# List available tools
python examples/example_client.py list-tools

# Echo text
python examples/example_client.py echo "Hello, world!"

# Calculate expression
python examples/example_client.py calculate "2 + 3 * 4"

# Ask AI with dynamic service selection
python examples/example_client.py ask "What is the capital of France?" --service claude

# System information
python examples/example_client.py system-info

WebSocket Browser Client

For WebSocket transport, open examples/websocket_client.html in a browser:

  1. Enter the WebSocket URL (e.g., ws://localhost:8765/)
  2. Click "Connect"
  3. Use the interface to send requests to the server
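For scripted WebSocket access, a minimal Python client can do the same thing. The sketch below assumes the third-party `websockets` package (`pip install websockets`) and uses the default host, port, and path from this README:

```python
# Minimal WebSocket client sketch for the MCP server (assumes the
# third-party `websockets` package is installed).
import asyncio
import json

def make_request(prompt: str, req_id: int = 1) -> str:
    # Build the unified ai/message JSON-RPC request.
    return json.dumps({
        "jsonrpc": "2.0",
        "method": "tools/call",
        "params": {"name": "ai/message", "arguments": {"prompt": prompt}},
        "id": req_id,
    })

async def main() -> None:
    import websockets  # third-party; imported lazily
    async with websockets.connect("ws://localhost:8765/") as ws:
        await ws.send(make_request("What is the capital of France?"))
        print(json.loads(await ws.recv()))

if __name__ == "__main__":
    asyncio.run(main())
```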

JSON-RPC Interface

Unified AI Message Request

{
  "jsonrpc": "2.0",
  "method": "tools/call",
  "params": {
    "name": "ai/message",
    "arguments": {
      "prompt": "What is the capital of France?",
      "service_name": "claude"
    }
  },
  "id": 1
}

The `service_name` argument is optional: `"claude"`, `"openai"`, or `"mock"`; omit it to use the default service.

Service-Specific Requests (for backward compatibility)

Claude:

{
  "jsonrpc": "2.0",
  "method": "tools/call",
  "params": {
    "name": "claude/message",
    "arguments": {
      "prompt": "What is the capital of France?"
    }
  },
  "id": 2
}

OpenAI:

{
  "jsonrpc": "2.0",
  "method": "tools/call",
  "params": {
    "name": "openai/message",
    "arguments": {
      "prompt": "What is the capital of France?"
    }
  },
  "id": 3
}

Available Methods

| Method | Description |
|--------|-------------|
| `initialize` | Initialize the server connection |
| `tools/list` | List available tools |
| `tools/call` | Call a tool with arguments |
| `resources/list` | List available resources |
| `resources/read` | Read a resource |
| `system/info` | Get system information |
| `system/health` | Check system health |

Extending the Server

Adding a New Method Handler

  1. Create a new handler class implementing the HandlerInterface in the handlers directory
  2. Register it in the AIMCPServerApp.initialize() method

Adding a New AI Service

  1. Create a new service class implementing the AIServiceInterface in the services directory
  2. Add it to the service registry in create_ai_services_from_config()

Adding a New Transport

  1. Create a new transport class extending the Transport class in the transports directory
  2. Update the main function to use your new transport
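The shape of a new transport can be sketched with an in-memory stand-in. The class below is hypothetical and does not extend the real `Transport` base class in `transports/base.py`, whose interface may differ; it only illustrates the receive/send split a transport provides:

```python
# Sketch of a custom transport (hypothetical; the real Transport base
# class may define a different interface). This in-memory transport is
# a stand-in for a real channel such as a Unix domain socket.

class ListTransport:
    def __init__(self, incoming: list[str]):
        self.incoming = incoming       # queued raw JSON-RPC messages
        self.outgoing: list[str] = []  # responses the server "sent"

    def receive(self):
        # Pop the next message, or None once the queue is drained.
        return self.incoming.pop(0) if self.incoming else None

    def send(self, message: str) -> None:
        self.outgoing.append(message)
```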

WebSocket and Load Balancers

When using a TLS/SSL-terminating load balancer (like AWS ELB) in front of this server:

  • Clients connect to the load balancer using secure WebSockets (wss://)
  • The load balancer handles TLS/SSL termination
  • The load balancer forwards traffic to the MCP server using regular WebSockets (ws://)
  • No need to implement WSS in the application itself

Performance Considerations

Processing very large repositories, especially C# projects, can generate many database operations. Consider batching inserts or using alternative storage strategies if you encounter performance issues with MongoDB.

License

MIT License
