
Dive AI Agent 🤿 🤖

Created by OpenAgentPlatform

Dive is an open-source MCP Host Desktop Application that seamlessly integrates with any LLM that supports function calling. ✨


Dive Demo

Features 🎯

  • 🌐 Universal LLM Support: Compatible with ChatGPT, Anthropic Claude, Ollama, and OpenAI-compatible models
  • 💻 Cross-Platform: Available for Windows, macOS, and Linux
  • 🔄 Model Context Protocol: Enables seamless MCP AI agent integration in both stdio and SSE modes
  • 🌐 Multi-Language Support: Traditional Chinese, Simplified Chinese, English, Spanish, and Japanese, with more coming soon
  • ⚙️ Advanced API Management: Multiple API keys and model switching support
  • 💡 Custom Instructions: Personalized system prompts for tailored AI behavior
  • 🔄 Auto-Update Mechanism: Automatically checks for and installs the latest application updates

Recent Updates (2025/4/21)

  • 🚀 Dive MCP Host v0.8.0: DiveHost, rewritten in Python, is now a separate project at dive-mcp-host
  • ⚙️ Enhanced LLM Settings: Add, modify, and delete LLM provider API keys and custom model IDs
  • 🔍 Model Validation: Validate, or skip validation for, models that support tool/function calling
  • 🔧 Improved MCP Configuration: Add, edit, and delete MCP tools directly from the UI
  • 🌐 Japanese Translation: Added Japanese language support
  • 🤖 Extended Model Support: Added Google Gemini and Mistral AI model integration

Important: Because DiveHost was migrated from TypeScript to Python in v0.8.0, configuration files and chat history records are not automatically upgraded. If you need to access your old data after upgrading, you can still downgrade to a previous version.

Download and Install ⬇️

Get the latest version of Dive: Download

For Windows users: 🪟

  • Download the .exe version
  • Python and Node.js environments are pre-installed

For macOS users: 🍎

  • Download the .dmg version
  • You need to install the Python and Node.js environments (providing the uvx and npx commands) yourself
  • Follow the installation prompts to complete setup

For Linux users: 🐧

  • Download the .AppImage version
  • You need to install the Python and Node.js environments (providing the uvx and npx commands) yourself
  • For Ubuntu/Debian users:
    • You may need to add the --no-sandbox flag
    • Or modify system settings to allow sandboxing
    • Run chmod +x to make the AppImage executable
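The Ubuntu/Debian steps above can be sketched as two shell commands; the file name below is a placeholder, so substitute the AppImage you actually downloaded:

```shell
# Make the downloaded AppImage executable, then launch it.
# "Dive-x.y.z.AppImage" is an illustrative name, not the exact release file.
chmod +x Dive-x.y.z.AppImage
./Dive-x.y.z.AppImage --no-sandbox   # omit --no-sandbox if your system allows sandboxing
```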

MCP Tips

While the system comes with a default echo MCP server, your LLM can access more powerful tools through MCP. Here's how to get started with a few beginner-friendly tools: Fetch, Filesystem, and Youtube-dl.

Set MCP

Quick Setup

Add this JSON configuration to your Dive MCP settings to enable these tools:

{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": [
        "mcp-server-fetch",
        "--ignore-robots-txt"
      ],
      "enabled": true
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/files"
      ],
      "enabled": true
    },
    "youtubedl": {
      "command": "npx",
      "args": [
        "@kevinwatt/yt-dlp-mcp"
      ],
      "enabled": true
    }
  }
}

Using SSE Server for MCP

You can also connect to an external MCP server via SSE (Server-Sent Events). Add this configuration to your Dive MCP settings:

{
  "mcpServers": {
    "MCP_SERVER_NAME": {
      "enabled": true,
      "transport": "sse",
      "url": "YOUR_SSE_SERVER_URL"
    }
  }
}
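Since both stdio and SSE entries are plain JSON, a quick way to catch syntax mistakes before pasting a config into Dive is to run it through any JSON parser. This is a generic sanity check, not a Dive command; the heredoc below holds a minimal stdio-server entry for illustration:

```shell
# Pipe the config through Python's JSON parser; a syntax error aborts before the echo.
cat <<'EOF' | python3 -m json.tool > /dev/null && echo "config is valid JSON"
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"],
      "enabled": true
    }
  }
}
EOF
```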

Additional Setup for yt-dlp-mcp

yt-dlp-mcp requires the yt-dlp package. Install it based on your operating system:

Windows

winget install yt-dlp

MacOS

brew install yt-dlp

Linux

pip install yt-dlp
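Whichever package manager you used, you can confirm afterwards that the binary is reachable from your PATH, where the yt-dlp-mcp server will look for it:

```shell
# Prints the installed version if yt-dlp is correctly on your PATH.
yt-dlp --version
```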

Build 🛠️

See BUILD.md for more details.

Connect With Us 🌐
