mcp-chat

Open Source Generic MCP Client for testing & evaluating mcp servers and agents

Quickstart

Make sure that you have ANTHROPIC_API_KEY exported in your environment or in a .env file in the root of the project. You can get an API key by signing up at the Anthropic Console keys page.
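For example, you can either export the key in your shell or put it in a .env file (the value below is a placeholder, not a real key):

# Export in your shell
export ANTHROPIC_API_KEY=sk-ant-...
# Or add a line to .env in the project root
ANTHROPIC_API_KEY=sk-ant-...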

Simple use case that spawns an interactive chat prompt with the filesystem MCP server from CLI:

npx mcp-chat --server "npx -y @modelcontextprotocol/server-filesystem /Users/$USER/Desktop"

This will open up a chat prompt that you can use to interact with the servers and chat with an LLM.

Config

You can also just specify your claude_desktop_config.json (Mac):

npx mcp-chat --config "~/Library/Application Support/Claude/claude_desktop_config.json"

Or (Windows):

npx mcp-chat --config "%APPDATA%\Claude\claude_desktop_config.json"
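If you don't already use Claude Desktop, a minimal config file follows the standard mcpServers layout; the server name and path below are only an example:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/username/Desktop"]
    }
  }
}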

Web mode

https://github.com/user-attachments/assets/b7e8a648-8084-4955-8cdf-fc6eb141572e

You can also run mcp-chat in web mode by specifying the --web flag (make sure to have ANTHROPIC_API_KEY exported in your environment):

npx mcp-chat --web

In web mode, you can start new chats, send messages to the model, and dynamically configure the MCP servers via the UI - no need to specify them on the command line. In addition, chats created via the Web UI are saved to ~/.mcpchat/chats, just like chats created via the CLI.
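Because web and CLI chats share the same directory, you can inspect saved sessions from a shell (the on-disk file layout isn't documented here, so treat this as a quick sanity check):

ls ~/.mcpchat/chats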

Features

  • Run via CLI in interactive mode or directly pass prompts with -p
  • Web mode to chat with models via a web interface --web
  • Connect to any MCP server (JS, Python, Docker) in production or during development
  • Choose between models with -m
  • Customize system prompt with --system
  • Saves chat history with settings in ~/.mcpchat/chats including web chats
  • Save and restore commands in ~/.mcpchat/history
  • View tool call output and arguments directly in chat to help debug mcp servers

CLI Usage

Run prompts via CLI with the -p flag:

npx mcp-chat --server "npx mcp-server-kubernetes" -p "List the pods in the default namespace"

This runs the prompt with the kubernetes mcp-server and exits after the response is printed to stdout.

Choose a model to chat with via CLI with the -m flag:

npx mcp-chat --server "npx mcp-server-kubernetes" -m "claude-3.5"

This uses the claude-3.5 model for the chat. Note that currently only Anthropic models are supported.

Custom system prompt:

--system flag can be used to specify a system prompt:

npx mcp-chat --system "Explain the output to the user in pirate speak." --server "npx mcp-server-kubernetes" -p "List the pods in the default namespace"
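These flags can be combined in a single invocation; the system prompt below is just an illustration:

npx mcp-chat --server "npx mcp-server-kubernetes" -m "claude-3.5" --system "Answer concisely." -p "List the pods in the default namespace"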

For developers of mcp-servers

You can pass in a local build of a Python or Node MCP server to test it out with mcp-chat:

Node JS:

# Directly executing built script
npx mcp-chat --server "/path/to/mcp-server-kubernetes/dist/index.js"
# Using node / bun
npx mcp-chat --server "node /path/to/mcp-server-kubernetes/dist/index.js"

Python:

# Python: Using uv
npx mcp-chat --server "uv --directory /path/to/mcp-server-weather/ run weather.py"
# Using python / python3 - make sure to run in venv or install deps globally
npx mcp-chat --server "/path/to/mcp-server-weather/weather.py"

Development

Install dependencies & run the CLI:

git clone https://github.com/Flux159/mcp-chat
bun install
bun run dev

To develop mcp-chat while connecting to an mcp-server, make a build & run the CLI with the server flag:

npm run build && node dist/index.js --server "npx mcp-server-kubernetes" -p "List the pods in the default namespace"

Testing:

bun run test

Building:

bun run build

Publishing:

bun run publish

Publishing Docker:

bun run dockerbuild

Project Structure

├── src/
│   ├── index.ts            # Main client implementation & CLI params
│   ├── constants.ts        # Default constants
│   ├── interactive.ts      # Interactive chat prompt handling & logic
├── test/                   # Test files
│   ├── cli.test.ts         # Test CLI params
│   ├── config.test.ts      # Test config file parsing
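For contributors unfamiliar with MCP clients, the core pattern in a client like this is spawning the --server command and talking to it over stdio via the MCP SDK. The sketch below illustrates that pattern with @modelcontextprotocol/sdk; it is not the actual mcp-chat implementation, and the client name/version are placeholders:

// Minimal MCP client sketch (TypeScript): spawn a server command over stdio and list its tools
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server process, e.g. what `--server "npx mcp-server-kubernetes"` resolves to
const transport = new StdioClientTransport({
  command: "npx",
  args: ["mcp-server-kubernetes"],
});

// Placeholder client identity; mcp-chat's real metadata may differ
const client = new Client({ name: "mcp-chat-sketch", version: "0.0.1" }, { capabilities: {} });

await client.connect(transport);

// Discover the tools the server exposes (these are what show up as tool calls in chat)
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));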

Publishing new release

Go to the releases page, click on "Draft New Release", click "Choose a tag" and create a new tag by typing out a new version number using "v{major}.{minor}.{patch}" semver format. Then, write a release title "Release v{major}.{minor}.{patch}" and description / changelog if necessary and click "Publish Release".

This will create a new tag which will trigger a new release build via the cd.yml workflow. Once successful, the new release will be published to npm. Note that there is no need to update the package.json version manually, as the workflow will automatically update the version number in the package.json file & push a commit to main.

License

MIT License
