
Chapel Support for MCP

Created by DanilaFe

A Model-Context-Protocol (MCP) server for the Chapel programming language, providing tools for working with Chapel code, accessing primers and examples, and integrating Chapel functionality with AI assistants and other tools.

What is Chapel?

Chapel is an open-source parallel programming language designed for productive parallel computing at scale. It aims to improve the programmability of parallel computers while matching or beating the performance and portability of current programming models like MPI, OpenMP, and CUDA.

Features

This MCP server provides the following Chapel support functionality:

  • Chapel Primer Access: Browse and access Chapel's educational primer examples
  • Code Compilation: Compile Chapel code directly through the API
  • Linting: Check Chapel code for style and best practices using chplcheck and apply automatic fixes
  • Smart CHPL_HOME Detection: Automatically locate Chapel's installation directory

Prerequisites

  • Python 3.13 or higher
  • Chapel programming language installed (see Chapel installation guide)
  • (Optional) chplcheck for linting functionality

Installation

  1. Clone this repository:

    git clone <repository-url>
    cd chapel-support
    
  2. Create and activate a virtual environment with UV:

    uv venv
    source .venv/bin/activate  # On Windows: .venv\Scripts\activate
    
  3. Synchronize the environment with project dependencies:

    uv sync
    

Configuration

The MCP server needs to know the location of your Chapel installation (CHPL_HOME). It will try to find it in this order:

  1. From the CHPL_HOME environment variable
  2. From a .env file in the project root
  3. By running chpl --print-chpl-home if the Chapel compiler is in your PATH

To use a .env file, create one in the project root with:

CHPL_HOME=/path/to/your/chapel/installation

See .env.example for a template.
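The lookup order above can be sketched as a small resolver. This is an illustrative sketch rather than the server's actual code; the `.env` parsing is deliberately minimal (plain `KEY=value` lines):

```python
import os
import shutil
import subprocess

def find_chpl_home(env=None, dotenv_path=".env"):
    """Resolve CHPL_HOME using the documented lookup order (illustrative sketch)."""
    env = os.environ if env is None else env
    # 1. Environment variable
    if env.get("CHPL_HOME"):
        return env["CHPL_HOME"]
    # 2. .env file in the project root (minimal KEY=value parsing)
    try:
        with open(dotenv_path) as f:
            for line in f:
                key, _, value = line.strip().partition("=")
                if key == "CHPL_HOME" and value:
                    return value
    except FileNotFoundError:
        pass
    # 3. Ask the compiler, if it is on PATH
    if shutil.which("chpl"):
        result = subprocess.run(["chpl", "--print-chpl-home"],
                                capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout.strip()
    return None
```

Each step falls through to the next only when it yields nothing, so an explicit environment variable always wins over a `.env` entry.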

Usage

Running the MCP Server

uv run chapel-support.py

This will start the MCP server in stdio transport mode using your virtual environment.

Integrating with AI Assistants or Tools

To use this MCP server with AI assistants or other tools, configure them to connect to this server. For example, in a client configuration file:

{
  "context_servers": {
    "chapel-support": {
      "command": {
        "path": "uv",
        "args": [
          "run",
          "--directory",
          "/path/to/chapel-support",
          "chapel-support.py"
        ],
        "env": {}
      },
      "settings": {}
    }
  }
}

Note: Adjust the directory path to the location of your chapel-support installation.

Available Tools

list_primers()

Gets the list of available Chapel primers.

Returns: A list of paths to primer files relative to CHPL_HOME.

get_primer(path: str)

Retrieves the content of a specific Chapel primer.

Parameters:

  • path: The path to the primer, as returned by list_primers()

Returns: The content of the primer as a string.
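Under the hood, listing and reading primers amounts to scanning Chapel's examples tree. The sketch below assumes primers live at `test/release/examples/primers` under `CHPL_HOME` (the layout in the Chapel source tree); the actual server may resolve them differently:

```python
from pathlib import Path

def list_primers(chpl_home):
    """List primer files relative to CHPL_HOME (directory layout is an assumption)."""
    root = Path(chpl_home)
    primer_dir = root / "test" / "release" / "examples" / "primers"
    return sorted(str(p.relative_to(root)) for p in primer_dir.rglob("*.chpl"))

def get_primer(chpl_home, path):
    """Read one primer, given a path returned by list_primers()."""
    return (Path(chpl_home) / path).read_text()
```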

compile_program(program_text: str, program_name: str = "program.chpl")

Compiles a Chapel program.

Parameters:

  • program_text: The Chapel code to compile
  • program_name: Optional name for the program file (default: "program.chpl")

Returns: A tuple containing:

  • Success status (boolean)
  • Compiler output/errors (string)
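Compilation likely boils down to writing the source into a scratch directory and shelling out to `chpl`. A hedged sketch follows; the real server's implementation may differ, and the `compiler` parameter is added here only for illustration and testing, not part of the tool's API:

```python
import subprocess
import tempfile
from pathlib import Path

def compile_program(program_text, program_name="program.chpl", compiler=("chpl",)):
    """Compile Chapel source text; returns (success, compiler output)."""
    with tempfile.TemporaryDirectory() as tmp:
        # Write the submitted code to a throwaway file
        src = Path(tmp) / program_name
        src.write_text(program_text)
        # Invoke the compiler and capture both streams
        result = subprocess.run(
            [*compiler, str(src), "-o", str(Path(tmp) / "program")],
            capture_output=True, text=True)
        return result.returncode == 0, result.stdout + result.stderr
```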

list_chapel_lint_rules()

Lists all available Chapel linting rules from chplcheck.

Returns: A list of dictionaries with rule information:

  • name: Rule name
  • description: Rule description
  • is_default: Whether the rule is enabled by default

lint_chapel_code(program_text: str, program_name: str = "program.chpl", fix: bool = False, custom_rules: Optional[List[str]] = None)

Lints Chapel code and optionally applies fixes.

Parameters:

  • program_text: The Chapel code to lint
  • program_name: Optional name for the program file (default: "program.chpl")
  • fix: Whether to apply automatic fixes (default: False)
  • custom_rules: List of specific rules to enable (default: None, uses default rules)

Returns: A dictionary containing:

  • warnings: String containing linting warnings
  • fixed_code: The fixed code if fix=True
  • error: Error message if something went wrong
  • stats: Statistics about the linting process
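The `stats` field could be derived by tallying chplcheck's warnings per rule. The sketch below assumes a `file:line: ... rule RuleName`-style warning format; that format is an assumption about chplcheck's output, not a documented contract:

```python
import re

# Assumed warning shape: "path/to/file.chpl:12: <node> violates rule RuleName"
WARNING_RE = re.compile(r"^(?P<file>[^:]+):(?P<line>\d+):.*\brule\s+(?P<rule>\w+)")

def summarize_warnings(chplcheck_output):
    """Tally warnings per rule name from chplcheck-style output."""
    stats = {}
    for line in chplcheck_output.splitlines():
        m = WARNING_RE.match(line)
        if m:
            stats[m.group("rule")] = stats.get(m.group("rule"), 0) + 1
    return stats
```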

Contributing

Contributions are welcome! Please feel free to submit pull requests or open issues.
