Glide to n8n MCP Adapter

This repository contains an MCP (Model Context Protocol) server that connects Glide API requests to n8n workflows. It allows AI assistants like Claude, GPT-4, and others to trigger n8n workflows and interact with Glide apps through a standard interface.

Features

  • Bridges Glide API operations to n8n workflows
  • Provides tools for discovering and managing available n8n workflows
  • Supports executing n8n workflows with parameters
  • Offers specialized tools for common Glide operations
  • Works over both the standard stdin/stdout MCP transport and an HTTP API
  • Includes Docker support for easy deployment
  • Integrates with Claude Desktop and other MCP clients

Overview

This adapter serves as a bridge between three components:

  1. AI Assistants (Claude, GPT-4, etc.) using the MCP protocol
  2. n8n Workflows using the Workflow Management API
  3. Glide Apps using the Glide API

It allows AI assistants to perform operations on Glide apps by leveraging n8n workflows, providing a seamless integration between these platforms.
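
For concreteness, the flow can be pictured as a single forwarding hop: the adapter receives an MCP tool call, wraps it in an HTTP request, and posts it to the n8n MCP webhook, which in turn talks to the Glide API. The sketch below illustrates that hop in Python; the function name and payload shape are illustrative assumptions, not the adapter's actual code.

    # Illustrative forwarding step (names and payload shape are hypothetical,
    # not taken from this repository).
    import os

    import requests

    N8N_WEBHOOK_URL = os.environ["N8N_WEBHOOK_URL"]

    def forward_tool_call(tool_name: str, arguments: dict) -> dict:
        """Send an MCP tool call to the n8n MCP webhook and return its result."""
        response = requests.post(
            N8N_WEBHOOK_URL,
            json={"tool": tool_name, "arguments": arguments},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()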

Prerequisites

  • n8n instance with the MCP workflow running
  • The webhook URL for the n8n MCP workflow
  • Python 3.9+ (if running without Docker)
  • Docker and Docker Compose (optional, for containerized deployment)

Quick Start

Option 1: Docker Deployment

  1. Clone this repository:

    git clone https://github.com/mows21/glide-n8n-mcp-adapter.git
    cd glide-n8n-mcp-adapter
    
  2. Copy .env.example to .env and fill in your n8n webhook URL:

    cp .env.example .env
    # Edit .env with your actual n8n webhook URL
    
  3. Build and run with Docker Compose:

    docker-compose up -d
    

Option 2: Manual Setup

  1. Clone this repository:

    git clone https://github.com/mows21/glide-n8n-mcp-adapter.git
    cd glide-n8n-mcp-adapter
    
  2. Create a virtual environment and install dependencies:

    python -m venv venv
    source venv/bin/activate  # On Windows, use: venv\Scripts\activate
    pip install -r requirements.txt
    
  3. Copy .env.example to .env and configure:

    cp .env.example .env
    # Edit .env with your actual n8n webhook URL
    
  4. Run the adapter:

    # For stdin/stdout mode (default MCP protocol)
    python glide_n8n_adapter.py
    
    # Or for HTTP mode
    HTTP_ADAPTER=true python http_adapter.py
    

n8n Configuration

This adapter requires an n8n instance with:

  1. An MCP server workflow using the provided n8n workflow JSON
  2. The webhook URL for the MCP server workflow (set in your .env file)
  3. Workflows for specific Glide operations (optional, can be configured in .env)

Setting up the n8n MCP Server Workflow

  1. In your n8n instance, import the workflow JSON provided in this repository
  2. Activate the workflow and note the webhook URL
  3. Update your .env file with the webhook URL
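
Once the workflow is active, a quick POST against the webhook confirms it is reachable before the adapter is wired up. A minimal check, assuming N8N_WEBHOOK_URL is set in your environment (the payload is a placeholder; the workflow defines what it actually expects):

    # Reachability check for the n8n MCP webhook (payload is a placeholder).
    import os

    import requests

    url = os.environ["N8N_WEBHOOK_URL"]
    resp = requests.post(url, json={"tool": "list_workflows", "arguments": {}}, timeout=15)
    print(resp.status_code, resp.text[:200])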

Available Tools

n8n Workflow Management

  • list_workflows: Lists all available workflows in n8n
  • search_workflows: Searches for workflows in n8n
  • add_workflow: Adds workflows to the available pool
  • remove_workflow: Removes workflows from the available pool
  • execute_workflow: Executes a workflow from the available pool

Glide API Operations

  • get_glide_rows: Get rows from a Glide table
  • create_glide_rows: Create new rows in a Glide table
  • update_glide_row: Update an existing row in a Glide table
  • delete_glide_row: Delete a row from a Glide table
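
When the adapter runs in HTTP mode, these tools can be exercised directly for testing. The snippet below is a sketch only: the /mcp endpoint comes from the HTTP mode described later in this README, while the request body shape and the parameter names (app_id, table_name) are assumptions, and the identifiers are placeholders.

    # Hypothetical test calls against the HTTP adapter; the endpoint, payload
    # shape, and parameter names are assumptions. Adjust to match the running server.
    import requests

    MCP_URL = "http://localhost:3000/mcp"

    def call_tool(name: str, arguments: dict) -> dict:
        resp = requests.post(MCP_URL, json={"tool": name, "arguments": arguments}, timeout=30)
        resp.raise_for_status()
        return resp.json()

    # Discover what the n8n side exposes, then fetch rows from a Glide table.
    print(call_tool("list_workflows", {}))
    print(call_tool("get_glide_rows", {"app_id": "YOUR_APP_ID", "table_name": "YOUR_TABLE"}))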

Integration with AI Assistants

Claude Desktop

  1. Make sure the MCP server is running with the HTTP adapter enabled:

    HTTP_ADAPTER=true python http_adapter.py
    
  2. Configure Claude Desktop to use the server:

    • Copy the claude-config.json file to your Claude Desktop configuration directory
    • Restart Claude Desktop

Other MCP Clients

For other MCP clients, use the appropriate configuration method for that client, pointing to:

  • HTTP mode: http://localhost:3000/mcp (or your deployed URL)
  • Stdin/stdout mode: The command to run the server (python glide_n8n_adapter.py)
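
For stdin/stdout mode, a client launches the server process and exchanges newline-delimited JSON-RPC messages with it. The sketch below assumes the adapter follows the standard MCP stdio transport; the exact handshake and message framing may differ in this implementation, so treat it as illustrative.

    # Minimal stdio client sketch, assuming standard newline-delimited MCP JSON-RPC.
    import json
    import subprocess

    proc = subprocess.Popen(
        ["python", "glide_n8n_adapter.py"],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,
    )

    def send(message: dict) -> dict:
        proc.stdin.write(json.dumps(message) + "\n")
        proc.stdin.flush()
        return json.loads(proc.stdout.readline())

    # Handshake, then ask which tools the server exposes.
    print(send({"jsonrpc": "2.0", "id": 1, "method": "initialize",
                "params": {"protocolVersion": "2024-11-05", "capabilities": {},
                           "clientInfo": {"name": "test-client", "version": "0.1"}}}))
    print(send({"jsonrpc": "2.0", "id": 2, "method": "tools/list"}))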

Custom n8n Workflows for Glide

To create custom n8n workflows for Glide operations:

  1. Create a new workflow in n8n
  2. Add an "Execute Workflow Trigger" node as the first node
  3. Define the input schema to match the parameters needed for the Glide operation
  4. Implement the Glide API calls using HTTP Request nodes (a Python equivalent of the request is sketched after these steps)
  5. Tag the workflow with "mcp" so it can be discovered
  6. Deploy and activate the workflow
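
As a reference for step 4, the request an HTTP Request node would issue to fetch rows looks roughly like the Python equivalent below. It targets Glide's queryTables endpoint; verify the endpoint and payload against Glide's current API documentation, and note that the GLIDE_API_TOKEN environment variable, app ID, and table name are placeholders.

    # Python equivalent of the HTTP Request node call for fetching Glide rows.
    # Endpoint and payload follow Glide's public table API; verify before use.
    import os

    import requests

    resp = requests.post(
        "https://api.glideapp.io/api/function/queryTables",
        headers={"Authorization": f"Bearer {os.environ['GLIDE_API_TOKEN']}"},
        json={"appID": "YOUR_APP_ID", "queries": [{"tableName": "YOUR_TABLE"}]},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())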

Environment Variables

Configure your .env file with these settings:

  • N8N_WEBHOOK_URL: The webhook URL for your n8n MCP server workflow
  • HTTP_ADAPTER: Set to "true" to enable HTTP mode, "false" for stdin/stdout mode
  • PORT: Port for HTTP adapter (default: 3000)
  • GLIDE_GET_ROWS_WORKFLOW_ID: ID of the workflow for getting rows (optional)
  • GLIDE_CREATE_ROWS_WORKFLOW_ID: ID of the workflow for creating rows (optional)
  • GLIDE_UPDATE_ROW_WORKFLOW_ID: ID of the workflow for updating rows (optional)
  • GLIDE_DELETE_ROW_WORKFLOW_ID: ID of the workflow for deleting rows (optional)
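
These settings are plain environment variables, so they can be read with the standard library. The sketch below only illustrates how they might be consumed; the variable names match the list above, but the loading code itself is an assumption, not this repository's implementation.

    # Illustrative loading of the settings above (the adapter's real code may differ).
    import os

    N8N_WEBHOOK_URL = os.environ["N8N_WEBHOOK_URL"]                      # required
    HTTP_ADAPTER = os.getenv("HTTP_ADAPTER", "false").lower() == "true"  # stdio by default
    PORT = int(os.getenv("PORT", "3000"))
    GLIDE_GET_ROWS_WORKFLOW_ID = os.getenv("GLIDE_GET_ROWS_WORKFLOW_ID")        # optional
    GLIDE_CREATE_ROWS_WORKFLOW_ID = os.getenv("GLIDE_CREATE_ROWS_WORKFLOW_ID")  # optional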

Troubleshooting

Common issues:

  • Connection errors: Check that your n8n instance is running and the webhook URL is correct
  • Missing workflows: Ensure workflows are properly tagged and active in n8n
  • Parameter errors: Verify that the workflow input schema matches the expected parameters
  • HTTP adapter issues: Check that Flask is running on the correct port and the endpoint is accessible
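
Two quick checks cover the most common of these, a closed adapter port and a missing webhook URL; both use only the standard library:

    # Quick diagnostics: is the HTTP adapter port open, and is the webhook URL set?
    import os
    import socket

    port = int(os.getenv("PORT", "3000"))
    with socket.socket() as s:
        s.settimeout(2)
        open_port = s.connect_ex(("localhost", port)) == 0
    print("HTTP adapter reachable" if open_port else f"Nothing listening on port {port}")

    print("N8N_WEBHOOK_URL is set" if os.getenv("N8N_WEBHOOK_URL")
          else "N8N_WEBHOOK_URL missing from environment/.env")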

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT

Acknowledgements

This project was inspired by:
