Make Copilot an Interactive Programmer

Created by BetterThanTomorrow

VS Code AI Agent Interactive Programming: tools for Copilot and other AI assistants. Can also be used as an MCP server.

Clojure Tools for Copilot

It is also an MCP Server for Calva

(Parts of this README were written by Claude Sonnet. Pardon any marketing language; I will clean it up.)

A VS Code Language Model extension for Calva, the Clojure/ClojureScript extension for VS Code, enabling AI assistants to harness the power of the REPL.

This extension exposes its AI tools both to Copilot directly, using the VS Code Language Model API, and via an optional MCP server for any other AI assistants/agents.

Features

  • Tool: Evaluate Code (disabled by default): gives the model access to the Clojure REPL to evaluate code at will
  • Tool: Bracket Balancer: helps the model get the bracket balance right (powered by Parinfer)
  • Tool: Symbol info lookup: the AI can look up symbols it is interested in and get doc strings, argument info, etc.
  • Tool: clojuredocs.org lookup: docs, examples, and see-also information for Clojure core-ish symbols
  • Resource: Symbol info lookup (a bit experimental): same as the tool
  • Resource: clojuredocs.org lookup (a bit experimental): same as the tool

Evaluation power is opt-in

Since evaluating arbitrary Clojure code can be risky, evaluation is disabled by default, so you can still use the server for its other tools. Search for Calva MCP in VS Code Settings to enable it.
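For reference, enabling it directly in settings.json might look like the sketch below. The setting ID shown is an assumption, not verified against the extension; search for "Calva MCP" in the Settings UI to find the actual key.

```json
{
  // Hypothetical setting ID -- search "Calva MCP" in the Settings UI
  // for the real key. VS Code's settings.json permits comments (JSONC).
  "calva-mcp-server.enableReplEvaluation": true
}
```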

Note that there are several layers to the security model here. The server starts with evaluation powers disabled, and compliant MCP clients default to a low-trust mode, asking for your confirmation every time the LLM wants to use a tool. Full YOLO mode is what you get if you enable the tool in the Calva MCP settings and configure your AI client to be allowed to use it without asking.

Why Calva Backseat Driver?

"I wish Copilot could actually run my Clojure code instead of just guessing what it might do."

The Calva Backseat Driver transforms AI coding assistants from static code generators into interactive programming partners by giving them access to your REPL. (Please be mindful about the implications of that before you start using it.)

Turn your AI Agent into an Interactive Programming partner

Tired of AI tools that write plausible-looking Clojure that falls apart at runtime? Calva Backseat Driver lets your AI assistant:

  • Evaluate code in your actual environment - No more "this might work" guesses
  • See real data structures, not just predict their shape
  • Test functions with real inputs before suggesting them
  • Debug alongside you with access to runtime errors
  • Learn from your codebase's actual behavior

For Clojurians who value Interactive Programming

As Clojure developers, we know the REPL isn't just a console - it's the center of our workflow. Now your AI assistant can join that workflow, understanding your data and functions as they actually exist, not just as they appear in static code.

In test-projects/example/AI_INTERACTIVE_PROGRAMMING.md you'll find an attempt to prompt the AI to leverage the REPL for interactive programming. (With varying success, help with this is much appreciated!)

Getting Started

Prerequisites

Code generation instructions

This is something we will have to figure out and discover together. Right now I include this in the github.copilot.chat.codeGeneration.instructions array, and it seems to work pretty well.

      {
        "text": "You are a senior Clojure developer who knows how to leverage the Calva Backseat Driver tools to improve your assistance. Your sources of truth are your tools for getting problem reports, code evaluation results, and Calva's output log. When you have edited a file, you always check the problem report. Before you apply edits, you check the balance of the whole would-be file with the balance_brackets tool.",
        "language": "clojure"
      },

The Backseat Driver extension provides this as default instructions, in case you don't have any yet.

  • As far as I know, there is no way for the extension to describe itself to Copilot.
  • For MCP clients, the server provides a description of itself.

Configuration (if using MCP Server)

Backseat Driver is a per-project MCP server, so it should be configured at the project level (if the assistant you are using allows it; Windsurf doesn't).

The MCP server runs as a plain socket server in the VS Code Extension Host, writing a port file when it starts. The MCP client then needs to start a stdio relay/proxy/wrapper that it talks to; the wrapper script takes the port file as an argument. For these and other reasons, there is one Calva Backseat Driver per workspace, and the port file is written to the .calva directory in the workspace root.
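The handshake just described can be sketched in shell. The port number here is made up for illustration; the extension writes the real one on startup:

```shell
# The extension writes its port to a well-known file on startup:
mkdir -p .calva/mcp-server
echo "54321" > .calva/mcp-server/port

# The stdio wrapper then reads that file to learn where to connect:
PORT=$(cat .calva/mcp-server/port)
echo "wrapper connects to localhost:$PORT"
```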

  1. Open your project
  2. Start the Calva MCP socket server
    • This will create a port file: ${workspaceFolder}/.calva/mcp-server/port

    • When the server is started, a confirmation dialog will be shown. This dialog has a button which lets you copy the command to start the stdio wrapper to the clipboard.

      (Screenshot: MCP Server Started message with Copy Command button)

  3. Add the MCP server config (will vary depending on MCP Client)

Cursor configuration

Cursor supports project level config.

In your project's .cursor/mcp.json, add a "backseat-driver" entry like so:

{
  "mcpServers": {
    "backseat-driver": {
      "command": "node",
      "args": [
        "<absolute path to wrapper script>",
        "<absolute path to port file (which points to your project's .calva/mcp-server/port)>"
      ]
    }
  }
}

Cursor will detect the server config and offer to start it.

You may want to check the Cursor MCP docs.

Windsurf configuration

Windsurf can use the Backseat Driver via its MCP server. However, it is a bit clunky, to say the least. Windsurf doesn't support workspace configurations for MCP servers; they are global only. This means:

  • In practice, you can only have one Backseat-Driver-backed project
  • You must use absolute paths for the stdio command's port file argument

Windsurf's configuration file has the same shape as Cursor's, and is located at: ~/.codeium/windsurf/mcp_config.json (at least on my machine).
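Assuming the same shape as the Cursor config above (the paths are illustrative placeholders, and here they must be absolute), an entry might look like:

```json
{
  "mcpServers": {
    "backseat-driver": {
      "command": "node",
      "args": [
        "/absolute/path/to/wrapper-script.js",
        "/absolute/path/to/your-project/.calva/mcp-server/port"
      ]
    }
  }
}
```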

The Windsurf AI assistant doesn't know about its own MCP configuration and will keep trying to create MCP configs for Copilot. Which is silly, because they won't work for Windsurf, and Copilot doesn't need them.

Clunk: At startup, even with the MCP server set to auto-start, Windsurf often refreshes its MCP servers quicker than the MCP server starts. You may need to refresh the tools in Windsurf. However, Windsurf doesn't seem to handle refreshing more than once well. It just keeps spinning the refresh button.

IMPORTANT: Windsurf uses MCP tools without checking with the user by default. This is fine for 3 out of 4 of the Backseat Driver tools, but for the REPL tool it is less ideal. I think some Windsurf user should report this non-compliance with MCP as an issue.

Other MCP client?

Please add configuration for other AI clients! 🙏

Using

  1. Connect Calva to your Clojure/ClojureScript project
  2. If you want the AI to have full REPL powers, enable this in settings.

All tools can be referenced in the chat:

  • #eval-clojure
  • #clojure-symbol
  • #clojuredocs
  • #calva-output

How It Works (evaluating code)

  1. When your AI assistant needs to understand your code better, it can execute it in your REPL
  2. The results flow back to the AI, giving it insight into actual data shapes and function behavior
  3. This creates a powerful feedback loop where suggestions improve based on runtime information
  4. You remain in control of this process, benefiting from an AI partner that truly understands your running code
flowchart TD
    subgraph InteractiveProgrammers["Interactive Programmers"]
        User([You])
        AIAgent([AI Agent])
        User <--> AIAgent
    end

    subgraph VSCode["VS Code"]

        MCP["Calva Backseat Driver"]

        subgraph Calva["Calva"]
            REPLClient["REPL Client"]
        end

        subgraph Project["Clojure Project"]

            subgraph RunningApp["Running Application"]
                SourceCode["Source Code"]
                REPL["REPL"]
            end
        end
    end

    User --> SourceCode
    User --> Calva
    REPLClient --> REPL
    AIAgent --> SourceCode
    AIAgent --> MCP
    MCP --> Calva

    classDef users fill:#ffffff,stroke:#63b132,stroke-width:1px,color:#63b132;
    classDef programmers fill:#63b132,stroke:#000000,stroke-width:2px,color:#ffffff;
    classDef vscode fill:#0078d7,stroke:#000000,stroke-width:1px,color:#ffffff;
    classDef calva fill:#df793b,stroke:#ffffff,stroke-width:1px,color:#ffffff;
    classDef highlight fill:#ffffff,stroke:#000000,stroke-width:1px,color:#000000;
    classDef dark fill:#333333,stroke:#ffffff,stroke-width:1px,color:#ffffff;
    classDef repl fill:#5881d8,stroke:#ffffff,stroke-width:1px,color:#ffffff;
    classDef running fill:#63b132,stroke:#ffffff,stroke-width:1px,color:#ffffff;
    classDef project fill:#888888,stroke:#ffffff,stroke-width:1px,color:#ffffff;

    class User,AIAgent users;
    class VSCode vscode;
    class Calva,MCP calva;
    class REPLClient repl;
    class Project project;
    class SourceCode dark;
    class RunningApp running;
    class REPL repl;
    class InteractiveProgrammers programmers;

MCP

Calva Backseat Driver implements the Model Context Protocol (MCP), creating a bridge between AI assistants and your REPL:

WIP

This is a super-early, bare-bones MCP server.

The “plan” (hope) is that we will expose much more of Calva's features. Please let us know what features you would like to see.

Contributing

Contributions are welcome! Issues, PRs, whatever. Before opening a PR, we appreciate an issue stating the problem being solved. You may also want to reach out to discuss the issue before starting to work on it.

License 🍻🗽

MIT

You are welcome to show me you like my work using this link:
