
Aider MCP Server

Created by danielscholl, 8 months ago
An experimental MCP server to use aider as a coding agent.

A Model Context Protocol (MCP) server that provides AI coding capabilities using Aider.

Status: Beta | Python: 3.10+

Overview

This MCP server leverages Aider, a powerful AI coding assistant, to provide coding capabilities via a standardized API. By offloading discrete coding tasks to Aider, it can reduce costs and exploit model-specific strengths, producing more reliable code through multiple focused LLM calls.

Features

  • AI Code Generation: Run one-shot coding tasks with Aider to add, fix, or enhance code
  • Model Selection: Query available models to choose the most appropriate one for your task
  • Flexible Configuration: Configure Aider sessions with customizable settings
  • Multi-transport Support: Run via Server-Sent Events (SSE) or stdio for flexible integration

Prerequisites

  • Python: 3.10 or higher
  • Package Manager: uv (recommended) or pip
  • API Keys: Depending on the models you want to use, you'll need API keys for:
    • OpenAI (for GPT models)
    • Anthropic (for Claude models)
    • Google (for Gemini models)

Installation

  1. Clone the repository:

    git clone https://github.com/your-username/aider-mcp.git
    cd aider-mcp
    
  2. Install the package:

    uv venv
    uv pip install -e .
    
  3. Run the tests to ensure everything is working:

    uv run pytest
    

Configuration

Configure the server behavior using environment variables in a .env file:

# Create environment file from example
cp .env.example .env

Edit the .env file to configure transport, host, port, and API keys.
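
As an illustration, a minimal `.env` might look like the following (all values are placeholders; set only the API keys for the providers you actually use):

```shell
TRANSPORT=sse
HOST=0.0.0.0
PORT=8050
GEMINI_API_KEY=<your-gemini-api-key>
```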

| Variable | Description | Default | Required |
|---|---|---|---|
| `TRANSPORT` | Transport protocol (`sse` or `stdio`) | `sse` | No |
| `HOST` | Host to bind to when using SSE transport | `0.0.0.0` | No |
| `PORT` | Port to listen on when using SSE transport | `8050` | No |
| `OPENAI_API_KEY` | API key for OpenAI models | | * |
| `ANTHROPIC_API_KEY` | API key for Anthropic models | | * |
| `GEMINI_API_KEY` | API key for Google Gemini models | | * |

*Required only if using models from that provider
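
The defaults in the table can be mirrored in code. A minimal sketch, assuming the server reads its configuration with `os.getenv` (the exact loading logic is internal to the server):

```python
import os

def load_config() -> dict:
    """Read server settings from the environment, applying the documented defaults."""
    return {
        "transport": os.getenv("TRANSPORT", "sse"),  # "sse" or "stdio"
        "host": os.getenv("HOST", "0.0.0.0"),        # bind address for SSE
        "port": int(os.getenv("PORT", "8050")),      # listen port for SSE
        # API keys default to None; only the provider you use needs one.
        "openai_api_key": os.getenv("OPENAI_API_KEY"),
        "anthropic_api_key": os.getenv("ANTHROPIC_API_KEY"),
        "gemini_api_key": os.getenv("GEMINI_API_KEY"),
    }

config = load_config()
print(config["transport"], config["port"])
```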

Command Line Options

  • --editor-model: Model to use for editing (default: gemini/gemini-2.5-pro-exp-03-25)
  • --architect-model: Model to use for architecture planning (optional)
  • --cwd: Current working directory (default: current directory)

Running the Server

Start the Server (SSE Mode)

# Using the module directly
uv run python -m aider_mcp_server

Or with custom settings:

uv run python -m aider_mcp_server --editor-model "gemini/gemini-2.5-pro-exp-03-25" --cwd "/path/to/project"

You should see output similar to:

Starting server with transport: sse
Using SSE transport on 0.0.0.0:8050

Using stdio Mode

When using stdio mode, you don't need to start the server separately - the MCP client will start it automatically when configured properly (see Integration with MCP Clients).

MCP Tools

The Aider MCP server exposes the following tools:

ai_code

Run Aider to perform coding tasks.

Parameters:

  • ai_coding_prompt: The prompt for the AI coding task
  • relative_editable_files: List of files that can be edited
  • relative_readonly_files: (Optional) List of files that should be read-only
  • settings: (Optional) Settings for the Aider session

Example:

{
  "ai_coding_prompt": "Add a function that calculates the factorial of a number",
  "relative_editable_files": ["math.py"],
  "settings": {
    "auto_commits": false,
    "use_git": false
  }
}
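
Before sending a request, a client can sanity-check the payload shape. A hypothetical helper (not part of the server) that enforces the parameter rules listed above:

```python
def validate_ai_code_request(payload: dict) -> None:
    """Raise ValueError if a hypothetical ai_code payload is malformed."""
    if not payload.get("ai_coding_prompt"):
        raise ValueError("ai_coding_prompt is required and must be non-empty")
    files = payload.get("relative_editable_files")
    if not isinstance(files, list) or not files:
        raise ValueError("relative_editable_files must be a non-empty list")
    # Optional fields, if present, must have the right types.
    if "relative_readonly_files" in payload and not isinstance(
        payload["relative_readonly_files"], list
    ):
        raise ValueError("relative_readonly_files must be a list")
    if "settings" in payload and not isinstance(payload["settings"], dict):
        raise ValueError("settings must be an object")

request = {
    "ai_coding_prompt": "Add a function that calculates the factorial of a number",
    "relative_editable_files": ["math.py"],
    "settings": {"auto_commits": False, "use_git": False},
}
validate_ai_code_request(request)  # no exception: the payload is well-formed
```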

get_models

List available Aider models filtered by substring.

Parameters:

  • substring: Substring to filter models by

Example:

{
  "substring": "openai"
}
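
Conceptually, this is a substring filter over model names. A sketch of the idea (whether matching is case-sensitive is an implementation detail of the server, and the model names below are illustrative, not the server's actual catalogue):

```python
def filter_models(models: list[str], substring: str) -> list[str]:
    """Return the models whose name contains the given substring."""
    return [m for m in models if substring in m]

# Illustrative names only; call the get_models tool for the real list.
catalogue = [
    "openai/gpt-4o",
    "anthropic/claude-3-5-sonnet",
    "gemini/gemini-2.5-pro-exp-03-25",
]
print(filter_models(catalogue, "openai"))  # → ['openai/gpt-4o']
```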

Integration with MCP Clients

Configure your MCP client to connect to the SSE endpoint:

{
  "mcpServers": {
    "aider-mcp-server": {
      "transport": "sse",
      "serverUrl": "http://localhost:8050/sse"
    }
  }
}

Stdio Integration

Configure your MCP client to run the server via stdio:

{
  "mcpServers": {
    "aider-mcp-server": {
      "transport": "stdio",
      "command": "python",
      "args": ["-m", "aider_mcp_server"],
      "env": {
        "TRANSPORT": "stdio"
      }
    }
  }
}

Using Docker

Note: a bind mount requires both a source and a target path; the target below mirrors the placeholder and should point at your project directory inside the container.

{
  "mcpServers": {
    "aider-mcp-server": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "--mount", "type=bind,source=<YOUR_FULL_PATH>,target=<YOUR_FULL_PATH>",
        "-e", "TRANSPORT=stdio",
        "-e", "EDITOR_MODEL=gemini/gemini-2.5-pro-preview-03-25",
        "-e", "GEMINI_API_KEY=<YOUR_API_KEY>",
        "danielscholl/aider-mcp-server"
      ]
    }
  }
}