Fledge MCP Server

This is a Model Context Protocol (MCP) server that connects Fledge functionality to Cursor AI, allowing the AI to interact with Fledge instances via natural language commands.

Prerequisites

  • Fledge installed locally or accessible via API (default: http://localhost:8081)
  • Cursor AI installed
  • Python 3.8+

Installation

  1. Clone this repository:

     git clone https://github.com/Krupalp525/fledge-mcp.git
     cd fledge-mcp

  2. Install the dependencies:

     pip install -r requirements.txt

Running the Server

  1. Make sure Fledge is running:

     fledge start

  2. Start the MCP server:

     python mcp_server.py

     For secure operation with API key authentication:

     python secure_mcp_server.py

  3. Verify it's working by accessing the health endpoint:

     curl http://localhost:8082/health

You should receive "Fledge MCP Server is running" as the response.
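If you prefer to script this check, the snippet below polls the same /health endpoint (port 8082, as in the curl command above) until the server answers or a timeout expires; it is a convenience sketch, not part of the repository.

    import time

    import requests

    def wait_for_server(url="http://localhost:8082/health", timeout=30):
        """Poll the MCP health endpoint until it responds or the timeout expires."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            try:
                resp = requests.get(url, timeout=2)
                if resp.ok:
                    return resp.text  # expected: "Fledge MCP Server is running"
            except requests.ConnectionError:
                pass  # server not listening yet
            time.sleep(1)
        raise TimeoutError(f"MCP server did not respond at {url}")

    print(wait_for_server())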

Connecting to Cursor

  1. In Cursor, go to Settings > MCP Servers

  2. Add a new server pointing at the MCP server's address (http://localhost:8082 by default).

  3. For the secure server, configure the "X-API-Key" header with the value from the api_key.txt file that is generated when the secure server starts.

  4. Test it: Open Cursor's Composer (Ctrl+I), type "Check if Fledge API is reachable," and the AI should call the validate_api_connection tool.

Available Tools

Data Access and Management

  1. get_sensor_data: Fetch sensor data from Fledge with optional filtering by time range and limit (see the sketch after this list)
  2. list_sensors: List all sensors available in Fledge
  3. ingest_test_data: Ingest test data into Fledge, with optional batch count
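These tools are invoked through the same /tools endpoint used in the curl examples further below. A minimal sketch of calling get_sensor_data from Python follows; the sensor_id and limit parameters appear in the tools/call example later in this document, while any time-range parameter names should be checked against tools.json.

    import requests

    payload = {
        "name": "get_sensor_data",
        "parameters": {
            "sensor_id": "temp1",  # sensor to query
            "limit": 10,           # maximum number of readings to return
        },
    }
    resp = requests.post("http://localhost:8082/tools", json=payload, timeout=10)
    resp.raise_for_status()
    print(resp.json())  # tool output, assumed to be JSON as in the curl examples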

Service Control

  1. get_service_status: Get the status of all Fledge services
  2. start_stop_service: Start or stop a Fledge service by type
  3. update_config: Update Fledge configuration parameters

Frontend Code Generation

  1. generate_ui_component: Generate React components for Fledge data visualization
  2. fetch_sample_frontend: Get sample frontend templates for different frameworks
  3. suggest_ui_improvements: Get AI-powered suggestions for improving UI code

Real-Time Data Streaming

  1. subscribe_to_sensor: Set up a subscription to sensor data updates
  2. get_latest_reading: Get the most recent reading from a specific sensor

Debugging and Validation

  1. validate_api_connection: Check if the Fledge API is reachable
  2. simulate_frontend_request: Test API requests with different methods and payloads

Documentation and Schema

  1. get_api_schema: Get information about available Fledge API endpoints
  2. list_plugins: List available Fledge plugins

Advanced AI-Assisted Features

  1. generate_mock_data: Generate realistic mock sensor data for testing

Testing the API

You can test the server using the included test scripts:

# For standard server
python test_mcp.py

# For secure server with API key
python test_secure_mcp.py

Security Options

The secure server (secure_mcp_server.py) adds API key authentication:

  1. On first run, it generates an API key stored in api_key.txt
  2. All requests must include this key in the X-API-Key header (see the client sketch after this list)
  3. Health check endpoint remains accessible without authentication
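Reading the generated key and attaching it to a request looks roughly like this; the sketch assumes api_key.txt sits in the working directory next to the server scripts and that the /tools endpoint is otherwise identical to the standard server's.

    from pathlib import Path

    import requests

    api_key = Path("api_key.txt").read_text().strip()

    resp = requests.post(
        "http://localhost:8082/tools",
        json={"name": "list_sensors"},
        headers={"X-API-Key": api_key},  # required on every request except /health
        timeout=10,
    )
    resp.raise_for_status()
    print(resp.json())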

Example API Requests

# Validate API connection
curl -X POST -H "Content-Type: application/json" -d '{"name": "validate_api_connection"}' http://localhost:8082/tools

# Generate mock data
curl -X POST -H "Content-Type: application/json" -d '{"name": "generate_mock_data", "parameters": {"sensor_id": "temp1", "count": 5}}' http://localhost:8082/tools

# Generate React chart component
curl -X POST -H "Content-Type: application/json" -d '{"name": "generate_ui_component", "parameters": {"component_type": "chart", "sensor_id": "temp1"}}' http://localhost:8082/tools

# For secure server, add API key header
curl -X POST -H "Content-Type: application/json" -H "X-API-Key: YOUR_API_KEY" -d '{"name": "list_sensors"}' http://localhost:8082/tools

Extending the Server

To add more tools (a minimal sketch follows these steps):

  1. Add the tool definition to tools.json
  2. Implement the tool handler in mcp_server.py and secure_mcp_server.py
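The exact tools.json schema and dispatch style should be copied from an existing entry; the sketch below is a hypothetical handler for step 2, in which the tool name (count_assets), the Fledge endpoint it calls, and the return shape are illustrative assumptions.

    import requests

    FLEDGE_API_URL = "http://localhost:8081/fledge"  # default from the config schema below

    def count_assets(parameters: dict) -> dict:
        """Hypothetical tool: report how many assets Fledge currently knows about."""
        # GET /fledge/asset returns one entry per asset; adjust if your Fledge
        # version exposes a different endpoint.
        resp = requests.get(f"{FLEDGE_API_URL}/asset", timeout=10)
        resp.raise_for_status()
        return {"asset_count": len(resp.json())}

A matching tools.json entry would give the tool's name, description, and parameter schema so that tools/list can advertise it to Cursor.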

Production Considerations

For production deployment:

  • Use HTTPS
  • Deploy behind a reverse proxy like Nginx
  • Implement more robust authentication (JWT, OAuth)
  • Add rate limiting (see the sketch after this list)
  • Set up persistent data storage for subscriptions
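As an illustration of the rate-limiting item, here is a minimal, framework-agnostic token-bucket sketch; it is not wired into mcp_server.py, whose internals are not shown here, and a reverse proxy or dedicated middleware is usually the better place for this in practice.

    import time

    class TokenBucket:
        """Allow up to `burst` requests at once, refilling at `rate_per_sec`."""

        def __init__(self, rate_per_sec: float, burst: int):
            self.rate = rate_per_sec
            self.capacity = burst
            self.tokens = float(burst)
            self.updated = time.monotonic()

        def allow(self) -> bool:
            now = time.monotonic()
            # Refill in proportion to the time elapsed since the last check.
            self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
            self.updated = now
            if self.tokens >= 1:
                self.tokens -= 1
                return True
            return False

    bucket = TokenBucket(rate_per_sec=5, burst=10)
    if not bucket.allow():
        print("429 Too Many Requests")  # reject or delay the request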

Deploying on Smithery.ai

The Fledge MCP Server can be deployed on Smithery.ai for enhanced scalability and availability. Follow these steps to deploy:

  1. Prerequisites

    • Docker installed on your local machine
    • A Smithery.ai account
    • The Smithery CLI tool installed
  2. Build and Deploy

    # Build the Docker image
    docker build -t fledge-mcp .
    
    # Deploy to Smithery.ai
    smithery deploy
    
  3. Configuration: The smithery.json file contains the configuration for your deployment:

    • WebSocket transport on port 8082
    • Configurable Fledge API URL
    • Tool definitions and parameters
    • Timeout settings
  4. Environment Variables: Set the following environment variables in your Smithery.ai dashboard:

    • FLEDGE_API_URL: Your Fledge API endpoint
    • API_KEY: Your secure API key (if using secure mode)
  5. Verification: After deployment, verify that your server is running:

    smithery status fledge-mcp
    
  6. Monitoring: Monitor your deployment through the Smithery.ai dashboard:

    • Real-time logs
    • Performance metrics
    • Error tracking
    • Resource usage
  7. Updating: To update your deployment:

    # Build new image
    docker build -t fledge-mcp .
    
    # Deploy updates
    smithery deploy --update
    

JSON-RPC Protocol Support

The server implements the Model Context Protocol (MCP) using JSON-RPC 2.0 over WebSocket. The following methods are supported (a minimal client sketch follows the examples):

  1. initialize

    {
        "jsonrpc": "2.0",
        "method": "initialize",
        "params": {},
        "id": "1"
    }
    

    Response:

    {
        "jsonrpc": "2.0",
        "result": {
            "serverInfo": {
                "name": "fledge-mcp",
                "version": "1.0.0",
                "description": "Fledge Model Context Protocol (MCP) Server",
                "vendor": "Fledge",
                "capabilities": {
                    "tools": true,
                    "streaming": true,
                    "authentication": "api_key"
                }
            },
            "configSchema": {
                "type": "object",
                "properties": {
                    "fledge_api_url": {
                        "type": "string",
                        "description": "Fledge API URL",
                        "default": "http://localhost:8081/fledge"
                    }
                }
            }
        },
        "id": "1"
    }
    
  2. tools/list

    {
        "jsonrpc": "2.0",
        "method": "tools/list",
        "params": {},
        "id": "2"
    }
    

    Response: Returns the list of available tools and their parameters.

  3. tools/call

    {
        "jsonrpc": "2.0",
        "method": "tools/call",
        "params": {
            "name": "get_sensor_data",
            "parameters": {
                "sensor_id": "temp1",
                "limit": 10
            }
        },
        "id": "3"
    }
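
For completeness, a minimal Python client for this WebSocket transport might look like the sketch below; it uses the third-party websockets package, assumes the server is reachable at ws://localhost:8082 (the port given in the Smithery configuration above), and does not cover how the secure server applies the X-API-Key check to WebSocket connections.

    import asyncio
    import json

    import websockets  # pip install websockets

    async def main():
        async with websockets.connect("ws://localhost:8082") as ws:
            # Handshake: announce the client and read serverInfo/configSchema.
            await ws.send(json.dumps(
                {"jsonrpc": "2.0", "method": "initialize", "params": {}, "id": "1"}))
            print(json.loads(await ws.recv()))

            # Invoke a tool using the tools/call request shape shown above.
            await ws.send(json.dumps({
                "jsonrpc": "2.0",
                "method": "tools/call",
                "params": {"name": "get_sensor_data",
                           "parameters": {"sensor_id": "temp1", "limit": 10}},
                "id": "2",
            }))
            reply = json.loads(await ws.recv())
            if "error" in reply:
                print("error:", reply["error"])  # standard JSON-RPC error object
            else:
                print(reply["result"])

    asyncio.run(main())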
    

Error Codes

The server follows standard JSON-RPC 2.0 error codes:

  • -32700: Parse error
  • -32600: Invalid Request
  • -32601: Method not found
  • -32602: Invalid params
  • -32000: Server error