Aider MCP WebSocket Server

Quick Start

This is a real MCP (Model Context Protocol) server that lets you control Aider programmatically via WebSocket. It's like having Aider as an API.

# 1. Install
npm install
cp .env.example .env
# Add your OPENAI_API_KEY=sk-... to .env

# 2. Run the MCP server (recommended)
npm run mcp

# 3. Test it
python3 test-debug.py

What you get: Send natural language commands to Aider, get back created/edited files and full responses. Each client gets isolated workspaces. Works with any MCP-compatible client (Claude Desktop, custom apps, etc.).

Overview

A production-ready MCP (Model Context Protocol) wrapper that exposes Aider's functionality over WebSocket, enabling editor plugins and services to interact with Aider programmatically. This project provides two server implementations:

  1. src/mcp-server.ts - Real MCP Protocol Server (Recommended)

    • Implements standard JSON-RPC 2.0 MCP protocol
    • Proper MCP methods: initialize, tools/list, tools/call
    • Smart completion detection (waits for Aider to finish)
    • Compatible with Claude Desktop and other MCP clients
  2. src/index.ts - Custom WebSocket Wrapper

    • Custom protocol with hello/welcome handshake
    • Direct stdio pipe to Aider process
    • Token-based authentication

Architecture

┌─────────────────┐         JSON-RPC 2.0        ┌─────────────────┐
│   MCP Client    │ ◄─────────────────────────► │   MCP Server    │
│  (Any MCP app)  │         WebSocket           │  (mcp-server.ts)│
└─────────────────┘                             └────────┬────────┘
                                                         │ spawn()
                                                ┌─────────────────┐
                                                │  Aider Process  │
                                                │  (CLI instance) │
                                                └─────────────────┘
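
The key idea is per-client isolation: each connected session gets its own workspace directory and its own Aider child process. Below is a rough sketch of that spawn step (the helper name is illustrative, not the actual mcp-server.ts code); it uses the --no-git and --model flags mentioned elsewhere in this README.

import { spawn } from 'node:child_process';
import { mkdirSync } from 'node:fs';
import { join } from 'node:path';

// Hypothetical sketch: one Aider process per client, each in its own workspace.
function spawnAiderForSession(sessionId: string, workspacesDir = './workspaces') {
  const workspace = join(workspacesDir, `mcp-${sessionId}`);
  mkdirSync(workspace, { recursive: true });

  return spawn('aider', ['--no-git', '--model', process.env.AIDER_MODEL ?? 'gpt-4o-mini'], {
    cwd: workspace,                    // isolate created/edited files per session
    env: { ...process.env },           // passes OPENAI_API_KEY through to Aider
    stdio: ['pipe', 'pipe', 'pipe'],   // drive Aider over stdin/stdout
  });
}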

Features

  • Full MCP Protocol Support: Standard MCP methods and JSON-RPC 2.0
  • Multi-tenant Isolation: One Aider child process per client
  • Automatic Completion Detection: Waits for Aider to complete tasks (see the sketch after this list)
  • Isolated Workspaces: Each session gets its own workspace directory
  • OpenAI API Integration: Works with your existing OpenAI API key
  • Real-time Communication: WebSocket-based bidirectional communication
  • Flexible Configuration: Environment variables and CLI arguments
  • Structured Logging: JSON logging with Pino

Installation

# Clone and install
git clone <repo-url>
cd aider_mcp
npm install

# Configure environment
cp .env.example .env
# Edit .env and add your OpenAI API key:
# OPENAI_API_KEY=sk-...

Prerequisites

  • Node.js 18+
  • Aider installed and available in PATH
  • OpenAI API key

Usage

Running the Servers

# Run MCP protocol server (recommended)
npm run mcp

# Run custom WebSocket wrapper
npm run dev

# Production build
npm run build && npm start

Testing

# Quick test of MCP server
python3 test-debug.py

# Full test with waiting
python3 test-mcp-wait.py

# JavaScript test client
node test-mcp-client.js

MCP Protocol Communication

Available Tools

The MCP server exposes these tools to clients:

  1. aider_command - Execute natural language commands

    {
      "name": "aider_command",
      "arguments": {
        "command": "create a REST API with Flask"
      }
    }
    
  2. aider_add_file - Add files to Aider's context

    {
      "name": "aider_add_file", 
      "arguments": {
        "filename": "app.py"
      }
    }
    
  3. aider_run_command - Run shell commands through Aider

    {
      "name": "aider_run_command",
      "arguments": {
        "command": "npm test"
      }
    }
    

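Clients can also discover these tools at runtime with the standard tools/list method. A minimal example (the result shape follows the MCP convention; exact descriptions and schemas depend on the server):

// Ask the server which tools it exposes (send after initialize)
ws.send(JSON.stringify({
  "jsonrpc": "2.0",
  "method": "tools/list",
  "params": {},
  "id": 2
}));

// Expected result shape (per the MCP spec):
// { "tools": [ { "name": "aider_command", "description": "...", "inputSchema": { ... } }, ... ] }
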
Connection Flow

// 1. Connect via WebSocket
const ws = new WebSocket('ws://localhost:8080');

// 2. Initialize MCP session
ws.send(JSON.stringify({
  "jsonrpc": "2.0",
  "method": "initialize",
  "params": {
    "protocolVersion": "0.1.0",
    "capabilities": {},
    "clientInfo": {
      "name": "your-client",
      "version": "1.0.0"
    }
  },
  "id": 1
}));

// 3. Call tools after initialization
ws.send(JSON.stringify({
  "jsonrpc": "2.0", 
  "method": "tools/call",
  "params": {
    "name": "aider_command",
    "arguments": {
      "command": "create a Python hello world script"
    }
  },
  "id": 2
}));
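
Responses come back as JSON-RPC messages carrying the same id as the request. A small sketch of reading them (this assumes the Node ws client; the exact contents of result depend on the tool output):

// 4. Match responses to requests by id
ws.on('message', (data) => {
  const msg = JSON.parse(data.toString());
  if (msg.id === 2) {
    if (msg.error) {
      console.error('Tool call failed:', msg.error);
    } else {
      console.log('Aider result:', msg.result);   // created/edited files and Aider's output
    }
  }
});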

Configuration

Environment Variables

# Required
OPENAI_API_KEY=sk-...        # Your OpenAI API key

# Optional
PORT=8080                    # WebSocket server port (default: 8080 for MCP server, 7011 for custom wrapper)
AIDER_MODEL=gpt-4o-mini      # Default model
WORKSPACES_DIR=./workspaces  # Where to create project directories
LOG_LEVEL=info               # Logging verbosity (debug|info|warn|error)
VALID_TOKENS=token1,token2   # Auth tokens for custom wrapper (empty = no auth)

CLI Arguments

Override configuration at runtime:

# Custom port and workspace
npm run mcp -- --port 9000
npm run dev -- --port 8080 --workspaces ./custom-workspaces

# Override environment variables (custom wrapper only)
npm run dev -- --set AIDER_MODEL=gpt-4 --set LOG_LEVEL=debug
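
The precedence is simple: values from .env are the baseline, and CLI flags override them. A hypothetical sketch of how that resolution could look (illustrative only, not the project's actual code; assumes the dotenv package):

import 'dotenv/config';   // loads OPENAI_API_KEY, PORT, etc. from .env

// Illustrative only: read a --port flag, fall back to the environment, then a default.
const argv = process.argv.slice(2);
const portFlag = argv.indexOf('--port');

const config = {
  port: portFlag !== -1 ? Number(argv[portFlag + 1]) : Number(process.env.PORT ?? 8080),
  model: process.env.AIDER_MODEL ?? 'gpt-4o-mini',
  workspacesDir: process.env.WORKSPACES_DIR ?? './workspaces',
  logLevel: process.env.LOG_LEVEL ?? 'info',
};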

Integration Examples

Claude Desktop App

Add to Claude Desktop config:

{
  "mcpServers": {
    "aider": {
      "command": "node",
      "args": ["/path/to/aider_mcp/dist/mcp-server.js"],
      "env": {
        "OPENAI_API_KEY": "your-key",
        "AIDER_MODEL": "gpt-4o-mini"
      }
    }
  }
}

Custom MCP Client

import { WebSocket } from 'ws';

class AiderMCPClient {
  constructor(url = 'ws://localhost:8080') {
    this.ws = new WebSocket(url);
    this.requestId = 1;
    this.pending = new Map();   // request id -> { resolve, reject }
    this.setup();
  }

  setup() {
    // Resolve the matching pending request when a JSON-RPC response arrives
    this.ws.on('message', (data) => {
      const msg = JSON.parse(data.toString());
      const handler = this.pending.get(msg.id);
      if (!handler) return;
      this.pending.delete(msg.id);
      msg.error ? handler.reject(msg.error) : handler.resolve(msg.result);
    });
  }

  async callTool(toolName, args) {
    return this.sendRequest('tools/call', {
      name: toolName,
      arguments: args
    });
  }

  sendRequest(method, params) {
    const id = this.requestId++;
    return new Promise((resolve, reject) => {
      this.pending.set(id, { resolve, reject });
      this.ws.send(JSON.stringify({
        jsonrpc: '2.0',
        method,
        params,
        id
      }));
    });
  }
}
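
Example usage of the client above (assumes the MCP server from npm run mcp is listening on port 8080):

const client = new AiderMCPClient();

client.ws.on('open', async () => {
  // MCP handshake first, then a tool call
  await client.sendRequest('initialize', {
    protocolVersion: '0.1.0',
    capabilities: {},
    clientInfo: { name: 'example-client', version: '1.0.0' }
  });

  const result = await client.callTool('aider_command', {
    command: 'create a Python hello world script'
  });
  console.log(result);
});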

Project Structure

aider_mcp/
├── src/
│   ├── index.ts              # Custom WebSocket wrapper
│   └── mcp-server.ts         # Real MCP protocol server
├── workspaces/               # Auto-generated client workspaces  
├── test-*.py/js              # Various test clients
├── package.json              # Node.js dependencies
├── tsconfig.json             # TypeScript configuration
├── .env                      # Environment configuration
└── README.md                 # This file

Scripts

  • npm run mcp - Start MCP protocol server (recommended)
  • npm run dev - Start custom WebSocket wrapper with hot reload
  • npm run build - Compile TypeScript to JavaScript
  • npm start - Start production server
  • npm run lint - Run ESLint
  • npm run typecheck - Run TypeScript type checking

Troubleshooting

Common Issues

  1. "ConnectionRefusedError" or "Connect call failed"

    • Server isn't running. Start it with npm run mcp
    • Wrong port. Check .env file (MCP default: 8080, Custom default: 7011)
  2. "Aider not found"

    # Install Aider
    pip install aider-chat
    
    # Verify installation
    which aider
    
  3. "No API key" or Aider errors

    • Add your OpenAI API key to .env:
    OPENAI_API_KEY=sk-proj-...
    
  4. Server returns too quickly / incomplete responses

    • Handled by the MCP server, which waits for Aider to finish (3-second silence detection)
  5. "Tool execution timeout"

    • Normal for complex tasks
    • The MCP server waits as long as needed

Debug Mode

# Run with debug logging
LOG_LEVEL=debug npm run mcp

Checking Results

Files created by Aider are stored in workspace directories:

# List all workspaces
ls -la workspaces/

# Check latest workspace  
ls -la workspaces/mcp-*/

# View created files
cat workspaces/mcp-*/your-file.py

Security Considerations

  • Isolated Workspaces: Each client session gets its own directory
  • Authentication: Custom wrapper supports token-based auth via VALID_TOKENS
  • Git Disabled: Runs with --no-git by default for safety
  • API Key Security: Never commit .env file
  • Production: Consider running in Docker for additional isolation

Performance Notes

  • First request to Aider may take longer (model loading)
  • Subsequent requests are faster
  • The server maintains one Aider process per session
  • Workspace cleanup is manual (delete old directories as needed)

License

MIT
