
MCP-CLI-HOST

Created by vincent-pli, 8 months ago

MCPCLIHost 🤖

A CLI host application that enables Large Language Models (LLMs) to interact with external tools through the Model Context Protocol (MCP). Currently supports OpenAI, Azure OpenAI, DeepSeek, and Ollama models.

English | 简体中文

What it looks like: 🤠

https://github.com/vincent-pli/mcp-cli-host/blob/main/mcp-cli-host.png

Features ✨

  • Interactive conversations with multiple LLM models
  • Support for multiple concurrent MCP servers
  • Dynamic tool discovery and integration
  • Tool calling capabilities for both model types
  • Configurable MCP server locations and arguments
  • Consistent command interface across model types
  • Configurable message history window for context management
  • Monitor/trace errors from the server side
  • Support for sampling
  • Support for Roots
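The configurable message history window listed above amounts to keeping only the most recent N messages when building context for the model. The sketch below is purely illustrative of that idea (the function name and message shape are assumptions, not mcp-cli-host's actual code):

```python
# Illustrative sketch of a message-history window: keep only the last
# `window` messages when sending context to the model.
# NOT mcp-cli-host's actual implementation; names are hypothetical.
from typing import Dict, List


def trim_history(messages: List[Dict[str, str]], window: int = 10) -> List[Dict[str, str]]:
    """Return at most the last `window` messages."""
    if window <= 0:
        return []
    return messages[-window:]


history = [{"role": "user", "content": f"message {i}"} for i in range(25)]
context = trim_history(history, window=10)
print(len(context))           # 10
print(context[0]["content"])  # message 15
```

A fixed-size tail like this keeps token usage bounded while preserving the most recent turns of the conversation.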

Environment Setup 🔧

  1. For OpenAI and DeepSeek:
export OPENAI_API_KEY='your-api-key'

By default, the base URL is "https://api.openai.com/v1" for OpenAI and "https://api.deepseek.com" for DeepSeek. You can change it with the --base-url flag.

  2. For Ollama, pull a model first:
ollama pull mistral
  • Ensure Ollama is running:
ollama serve
  3. For Azure OpenAI:
export AZURE_OPENAI_DEPLOYMENT='your-azure-deployment'
export AZURE_OPENAI_API_KEY='your-azure-openai-api-key'
export AZURE_OPENAI_API_VERSION='your-azure-openai-api-version'
export AZURE_OPENAI_ENDPOINT='your-azure-openai-endpoint'

Installation 📦

pip install mcp-cli-host

Configuration ⚙️

MCPCLIHost automatically looks for a configuration file at ~/.mcp.json. You can also specify a custom location with the --config flag:

{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": [
        "mcp-server-sqlite",
        "--db-path",
        "/tmp/foo.db"
      ]
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/tmp"
      ]
    }
  }
}
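Because the configuration is plain JSON, it is straightforward to sanity-check before launching. The snippet below is a hedged sketch based only on the fields shown above (it is not the tool's own loader):

```python
import json

# Sketch of loading and sanity-checking a ~/.mcp.json-style config.
# Not mcp-cli-host's own loader; the checks mirror the documented fields.
CONFIG_TEXT = """
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "/tmp/foo.db"]
    }
  }
}
"""

config = json.loads(CONFIG_TEXT)
servers = config["mcpServers"]
for name, spec in servers.items():
    # Each entry needs a `command` string and an `args` list.
    assert isinstance(spec["command"], str), name
    assert isinstance(spec["args"], list), name

print(sorted(servers))  # ['sqlite']
```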

Each MCP server entry requires:

  • command: The command to run (e.g., uvx, npx)
  • args: Array of arguments for the command:
    • For SQLite server: mcp-server-sqlite with database path
    • For filesystem server: @modelcontextprotocol/server-filesystem with directory path

Usage 🚀

MCPCLIHost is a CLI tool that allows you to interact with various AI models through a unified interface. It supports various tools through MCP servers.

Available Models

Models can be specified using the --model (-m) flag:

  • Deepseek: deepseek:deepseek-chat
  • OpenAI: openai:gpt-4
  • Ollama models: ollama:modelname
  • Azure OpenAI: azure:gpt-4-0613
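Note that the provider prefix is separated from the model name by the first colon only: Ollama tags can themselves contain a colon (e.g. ollama:qwen2.5:3b). A hypothetical parser for this format might look like the following (this is an illustration, not mcp-cli-host's actual code):

```python
# Sketch of parsing the provider:model flag format. Ollama tags can
# themselves contain ':' (e.g. "qwen2.5:3b"), so split only once.
# Hypothetical helper, not mcp-cli-host's actual parser.
def parse_model(spec: str) -> tuple:
    provider, sep, model = spec.partition(":")
    if not sep or not model:
        raise ValueError(f"expected provider:model, got {spec!r}")
    return provider, model


print(parse_model("openai:gpt-4"))       # ('openai', 'gpt-4')
print(parse_model("ollama:qwen2.5:3b"))  # ('ollama', 'qwen2.5:3b')
```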

Examples

# Use Ollama with Qwen model
mcpclihost -m ollama:qwen2.5:3b

# Use Deepseek
mcpclihost -m deepseek:deepseek-chat

Flags

  • --config string: Config file location (default is $HOME/.mcp.json)
  • --debug: Enable debug logging
  • --message-window int: Number of messages to keep in context (default: 10)
  • -m, --model string: Model to use (format: provider:model) (default "anthropic:claude-3-5-sonnet-latest")
  • --base-url string: Base URL for OpenAI API (defaults to api.openai.com)
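For illustration, the flag set above could be mirrored with a standard argparse parser. This is a hypothetical sketch with defaults copied from the documentation, not the tool's real CLI code:

```python
import argparse

# Hypothetical argparse mirror of the documented flags; defaults are
# copied from the flag list above. Not mcp-cli-host's real CLI code.
parser = argparse.ArgumentParser(prog="mcpclihost")
parser.add_argument("--config", default=None, help="Config file location")
parser.add_argument("--debug", action="store_true", help="Enable debug logging")
parser.add_argument("--message-window", type=int, default=10,
                    help="Number of messages to keep in context")
parser.add_argument("-m", "--model", default="anthropic:claude-3-5-sonnet-latest",
                    help="Model to use (format: provider:model)")
parser.add_argument("--base-url", default=None, help="Base URL for the OpenAI API")

args = parser.parse_args(["-m", "deepseek:deepseek-chat", "--debug"])
print(args.model, args.debug, args.message_window)
# deepseek:deepseek-chat True 10
```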

Interactive Commands

While chatting, you can use:

  • /help: Show available commands
  • /tools: List all available tools
  • /servers: List configured MCP servers
  • /history: Display conversation history
  • Ctrl+C: Exit at any time

MCP Server Compatibility 🔌

MCPCliHost can work with any MCP-compliant server. For examples and reference implementations, see the MCP Servers Repository.

License 📄

This project is licensed under the Apache 2.0 License - see the LICENSE file for details.
