Background Process MCP

Created By
waylaidwanderer, 3 months ago
A Model Context Protocol (MCP) server that provides background process management capabilities. This server enables LLMs to start, stop, and monitor long-running command-line processes.

Motivation

Some AI agents, like Claude Code, can manage background processes natively, but many others can't. This project provides that capability as a standard tool for other agents like Google's Gemini CLI. It works as a separate service, making long-running task management available to a wider range of agents. I also added a TUI because I wanted to be able to monitor the processes myself.

Screenshot

TUI Screenshot

Getting Started

To get started, install the Background Process MCP server in your preferred client.

Standard Config

This configuration works for most MCP clients:

{
  "mcpServers": {
    "backgroundProcess": {
      "command": "npx",
      "args": [
        "@waylaidwanderer/background-process-mcp@latest"
      ]
    }
  }
}

To connect to a standalone server, add the --port argument to the args array (e.g., ...mcp@latest", "--port", "31337"]).
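For reference, the full config with the --port argument added looks like this, assuming a standalone server is already listening on port 31337 (the server's default):

```json
{
  "mcpServers": {
    "backgroundProcess": {
      "command": "npx",
      "args": [
        "@waylaidwanderer/background-process-mcp@latest",
        "--port",
        "31337"
      ]
    }
  }
}
```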

Claude Code

Use the Claude Code CLI to add the Background Process MCP server:

claude mcp add backgroundProcess npx @waylaidwanderer/background-process-mcp@latest

Claude Desktop

Follow the MCP install guide and use the standard config above.

Codex

Create or edit the configuration file ~/.codex/config.toml and add:

[mcp_servers.backgroundProcess]
command = "npx"
args = ["@waylaidwanderer/background-process-mcp@latest"]

For more information, see the Codex MCP documentation.

Cursor

Click the button to install:

Install in Cursor

Or install manually:

Go to Cursor Settings -> MCP -> Add new MCP Server. Name it backgroundProcess, choose the command type, and set the command to npx @waylaidwanderer/background-process-mcp@latest.

Gemini CLI

Follow the MCP install guide and use the standard config above.

Goose

Click the button to install:

Install in Goose

Or install manually:

Go to Advanced settings -> Extensions -> Add custom extension. Name it backgroundProcess, use type STDIO, and set the command to npx @waylaidwanderer/background-process-mcp@latest. Click "Add Extension".

LM Studio

Click the button to install:

Add MCP Server backgroundProcess to LM Studio

Or install manually:

Go to Program in the right sidebar -> Install -> Edit mcp.json. Use the standard config above.

opencode

Follow the MCP Servers documentation. For example in ~/.config/opencode/opencode.json:

{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "backgroundProcess": {
      "type": "local",
      "command": [
        "npx",
        "@waylaidwanderer/background-process-mcp@latest"
      ],
      "enabled": true
    }
  }
}

Qodo Gen

Open the Qodo Gen chat panel in VSCode or IntelliJ -> Connect more tools -> + Add new MCP -> Paste the standard config above.

Click Save.

VS Code (for GitHub Copilot)

Click the button to install:

Install in VS Code
Install in VS Code Insiders

Or install manually:

Follow the MCP install guide and use the standard config above. You can also install the server using the VS Code CLI:

# For VS Code
code --add-mcp '{"name":"backgroundProcess","command":"npx","args":["@waylaidwanderer/background-process-mcp@latest"]}'

Windsurf

Follow the Windsurf MCP documentation and use the standard config above.

Tools

The following tools are exposed by the MCP server.

Process Management
  • start_process

    • Description: Starts a new process in the background.
    • Parameters:
      • command (string): The shell command to execute.
    • Returns: A confirmation message with the new process ID.
  • stop_process

    • Description: Stops a running process.
    • Parameters:
      • processId (string): The UUID of the process to stop.
    • Returns: A confirmation message.
  • clear_process

    • Description: Clears a stopped process from the list.
    • Parameters:
      • processId (string): The UUID of the process to clear.
    • Returns: A confirmation message.
  • get_process_output

    • Description: Gets the recent output for a process. Can specify head for the first N lines or tail for the last N lines.
    • Parameters:
      • processId (string): The UUID of the process to get output from.
      • head (number, optional): The number of lines to get from the beginning of the output.
      • tail (number, optional): The number of lines to get from the end of the output.
    • Returns: The requested process output as a single string.
  • list_processes

    • Description: Gets a list of all processes being managed by the Core Service.
    • Parameters: None
    • Returns: A JSON string representing an array of all process states.
  • get_server_status

    • Description: Gets the current status of the Core Service.
    • Parameters: None
    • Returns: A JSON string containing server status information (version, port, PID, uptime, process counts).
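As an illustration of how an agent would use these tools, here is a sketch of the JSON-RPC tools/call request an MCP client sends to invoke start_process (the command value "npm run dev" is a hypothetical example, not something specific to this server):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "start_process",
    "arguments": {
      "command": "npm run dev"
    }
  }
}
```

The response would contain the confirmation message with the new process ID, which the agent can then pass as the processId parameter to get_process_output, stop_process, or clear_process.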

Architecture

The project has three components:

  1. Core Service (src/server.ts): A standalone WebSocket server that uses node-pty to manage child process lifecycles. It is the single source of truth for all process states. It is designed to be standalone so that other clients beyond the official TUI and MCP can be built for it.

  2. MCP Client (src/mcp.ts): Exposes the Core Service functionality as a set of tools for an LLM agent. It can connect to an existing service or spawn a new one.

  3. TUI Client (src/tui.ts): An ink-based terminal UI that connects to the Core Service to display process information and accept user commands.

Manual Usage

If you wish to run the server and TUI manually outside of an MCP client, you can use the following commands.

For a shorter command, you can install the package globally:

pnpm add -g @waylaidwanderer/background-process-mcp

This will give you access to the bgpm command.

1. Run the Core Service

Start the background service manually:

# With npx
npx @waylaidwanderer/background-process-mcp server

# Or, if installed globally
bgpm server

The server will listen on an available port (defaulting to 31337) and output a JSON handshake with the connection details.
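As a rough sketch of what to expect (the exact field names are not documented here and may differ), the handshake is a single JSON object along these lines:

```json
{ "port": 31337, "pid": 12345 }
```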

2. Use the TUI

Connect the TUI to a running server via its port:

# With npx
npx @waylaidwanderer/background-process-mcp ui --port <port_number>

# Or, if installed globally
bgpm ui --port <port_number>
