Sponsored by Deepsite.site

Enterprise Model Context Protocol (MCP) Server & Client

Created By
sanjay-si8 months ago
MCP Client & Server
Content

📖 Full Tutorial & Code Walkthrough

Enterprise Model Context Protocol (MCP) Server & Client

A comprehensive enterprise-grade implementation of the Model Context Protocol (MCP) for connecting LLMs with enterprise tools and data sources.

💰 Cost-Optimized for Production

  • Zero-cost testing: Works without any LLM API keys
  • Optimized models: Uses gpt-4o-mini and claude-3.5-haiku (97% cheaper than premium models)
  • Single API key: Choose either OpenAI OR Anthropic (not both)
  • Direct tool access: Enterprise functionality without LLM overhead

🚀 Features

MCP Server

  • JSON-RPC 2.0 Protocol: Full MCP specification compliance
  • Enterprise Tools: Database queries, file operations, API integrations
  • Security: JWT authentication, role-based access control
  • Monitoring: Prometheus metrics, structured logging
  • WebSocket Support: Real-time bidirectional communication

MCP Client

  • Multi-LLM Support: OpenAI GPT and Anthropic Claude integration
  • Tool Discovery: Automatic detection of server capabilities
  • Async Operations: High-performance async/await architecture
  • Error Handling: Robust error handling and retry mechanisms

Enterprise Tools

  • Database Tool: Secure SQL query execution with injection protection
  • File Tool: File system operations with sandboxed access
  • API Tool: HTTP client for external API integrations

📦 Installation

Prerequisites

  • Python 3.11+
  • PostgreSQL (optional, for enterprise database features)
  • Redis (optional, for caching and session management)

Quick Start with Docker

# Clone the repository
git clone <repository-url>
cd enterprise_mcp

# Start all services
docker-compose up -d

# The MCP server will be available at ws://localhost:8000/mcp

Manual Installation

Server Setup

cd mcp_server
pip install -r requirements.txt

# Set up environment variables
cp ../.env.example .env
# Edit .env with your configuration

# Run the server
python -m src.mcp_server.main

Client Setup

cd mcp_client
pip install -r requirements.txt

# Set environment variables for LLM APIs (OPTIONAL - choose one or none)
export OPENAI_API_KEY="your-openai-key"      # OR
export ANTHROPIC_API_KEY="your-anthropic-key" # OR neither for zero-cost testing

# Run the demo (with LLM)
python examples/demo.py

# OR run zero-cost demo (no API keys needed)
python examples/demo_no_llm.py

🔧 Configuration

Environment Variables

# Server Configuration
MCP_SERVER_HOST=localhost
MCP_SERVER_PORT=8000
MCP_SERVER_SECRET_KEY=your-secret-key-here

# Database Configuration
DATABASE_URL=postgresql://user:password@localhost:5432/mcp_enterprise
REDIS_URL=redis://localhost:6379

# Authentication
JWT_SECRET_KEY=your-jwt-secret-key
JWT_ALGORITHM=HS256
JWT_EXPIRATION_HOURS=24

# LLM Configuration (Optional - Cost Optimized)
# Choose ONE API key for cost efficiency:
OPENAI_API_KEY=your-openai-key           # Uses gpt-4o-mini ($0.15/1M tokens)
# OR
ANTHROPIC_API_KEY=your-anthropic-key     # Uses claude-3.5-haiku ($1/1M tokens)

# Enterprise Tools
ENTERPRISE_DB_URL=postgresql://user:password@localhost:5432/enterprise_data
ENTERPRISE_API_BASE_URL=https://api.enterprise.com
ENTERPRISE_API_KEY=your-enterprise-api-key

🎯 Usage Examples

Basic MCP Client Usage

import asyncio
from src.mcp_client.core.client import MCPClient
from src.mcp_client.llm.openai_integration import OpenAIWithMCP

async def main():
    # Initialize MCP client
    mcp_client = MCPClient(
        server_url="ws://localhost:8000/mcp",
        client_info={"name": "My App", "version": "1.0.0"}
    )
    
    # Connect to server
    await mcp_client.connect()
    
    # Initialize LLM with MCP tools
    llm = OpenAIWithMCP(
        api_key="your-openai-key",
        mcp_client=mcp_client
    )
    
    # Use AI with enterprise tools
    messages = [{
        "role": "user",
        "content": "Query the database for user analytics and create a report file"
    }]
    
    result = await llm.chat_completion_with_tools(messages)
    print(result['response'])
    
    await mcp_client.disconnect()

asyncio.run(main())

Database Operations

# Through MCP client
result = await mcp_client.call_tool("database_query", {
    "query": "SELECT COUNT(*) FROM users WHERE created_at > ?",
    "parameters": ["2024-01-01"],
    "limit": 100
})

File Operations

# Create and read files
await mcp_client.call_tool("file_operations", {
    "operation": "write",
    "path": "/tmp/mcp_workspace/report.txt",
    "content": "Enterprise report data..."
})

API Integration

# Make HTTP requests
await mcp_client.call_tool("api_request", {
    "method": "GET",
    "url": "https://api.enterprise.com/metrics",
    "headers": {"Authorization": "Bearer token"}
})

🏗️ Architecture

┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   LLM Client    │    │   MCP Client    │    │   MCP Server    │
│  (OpenAI/Claude)│◄──►│   (WebSocket)   │◄──►│   (FastAPI)     │
└─────────────────┘    └─────────────────┘    └─────────────────┘
                                               ┌─────────────────┐
                                               │ Enterprise Tools │
                                               │ • Database      │
                                               │ • File System   │
                                               │ • APIs          │
                                               └─────────────────┘

🔒 Security Features

  • JWT Authentication: Secure token-based authentication
  • Role-Based Access Control: Fine-grained permissions
  • SQL Injection Protection: Query validation and parameterization
  • Sandboxed File Access: Restricted file system operations
  • HTTPS/WSS Support: Encrypted communications
  • Audit Logging: Comprehensive security event logging

📊 Monitoring

Prometheus Metrics

  • mcp_requests_total: Total MCP requests by method and status
  • mcp_request_duration_seconds: Request duration histogram
  • mcp_active_connections: Current active connections
  • mcp_tool_calls_total: Tool execution counts
  • mcp_tool_execution_duration: Tool execution time

Health Endpoints

  • GET /health: Basic health check
  • GET /metrics: Prometheus metrics endpoint

🧪 Testing

# Run server tests
cd mcp_server
pytest tests/

# Run client tests
cd mcp_client
pytest tests/

# Run integration tests
pytest tests/integration/

🚀 Deployment

Production Docker Setup

# Build and deploy
docker-compose -f docker-compose.prod.yml up -d

# Scale the server
docker-compose -f docker-compose.prod.yml up -d --scale mcp-server=3

Kubernetes Deployment

# Apply Kubernetes manifests
kubectl apply -f k8s/

# Check deployment status
kubectl get pods -l app=mcp-server

📈 Performance

  • Concurrent Connections: Supports 1000+ simultaneous WebSocket connections
  • Request Throughput: 10,000+ requests per second
  • Tool Execution: Sub-100ms response time for database queries
  • Memory Usage: <512MB per server instance

🛣️ Roadmap

  • Redis caching for improved performance
  • GraphQL API integration tool
  • Enterprise SSO integration
  • Multi-tenant support
  • Advanced audit and compliance features
  • Horizontal auto-scaling

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🆘 Support

🙏 Acknowledgments

Recommend Servers
TraeBuild with Free GPT-4.1 & Claude 3.7. Fully MCP-Ready.
MiniMax MCPOfficial MiniMax Model Context Protocol (MCP) server that enables interaction with powerful Text to Speech, image generation and video generation APIs.
MCP AdvisorMCP Advisor & Installation - Use the right MCP server for your needs
Visual Studio Code - Open Source ("Code - OSS")Visual Studio Code
BlenderBlenderMCP connects Blender to Claude AI through the Model Context Protocol (MCP), allowing Claude to directly interact with and control Blender. This integration enables prompt assisted 3D modeling, scene creation, and manipulation.
Tavily Mcp
Amap Maps高德地图官方 MCP Server
Howtocook Mcp基于Anduin2017 / HowToCook (程序员在家做饭指南)的mcp server,帮你推荐菜谱、规划膳食,解决“今天吃什么“的世纪难题; Based on Anduin2017/HowToCook (Programmer's Guide to Cooking at Home), MCP Server helps you recommend recipes, plan meals, and solve the century old problem of "what to eat today"
TimeA Model Context Protocol server that provides time and timezone conversion capabilities. This server enables LLMs to get current time information and perform timezone conversions using IANA timezone names, with automatic system timezone detection.
AiimagemultistyleA Model Context Protocol (MCP) server for image generation and manipulation using fal.ai's Stable Diffusion model.
Playwright McpPlaywright MCP server
WindsurfThe new purpose-built IDE to harness magic
ChatWiseThe second fastest AI chatbot™
CursorThe AI Code Editor
Serper MCP ServerA Serper MCP Server
Y GuiA web-based graphical interface for AI chat interactions with support for multiple AI models and MCP (Model Context Protocol) servers.
DeepChatYour AI Partner on Desktop
EdgeOne Pages MCPAn MCP service designed for deploying HTML content to EdgeOne Pages and obtaining an accessible public URL.
Zhipu Web SearchZhipu Web Search MCP Server is a search engine specifically designed for large models. It integrates four search engines, allowing users to flexibly compare and switch between them. Building upon the web crawling and ranking capabilities of traditional search engines, it enhances intent recognition capabilities, returning results more suitable for large model processing (such as webpage titles, URLs, summaries, site names, site icons, etc.). This helps AI applications achieve "dynamic knowledge acquisition" and "precise scenario adaptation" capabilities.
Baidu Map百度地图核心API现已全面兼容MCP协议,是国内首家兼容MCP协议的地图服务商。
Jina AI MCP ToolsA Model Context Protocol (MCP) server that integrates with Jina AI Search Foundation APIs.