
DeepChat

Your AI Partner on Desktop

Dolphins are good friends of whales, and DeepChat is your good assistant

中文 / English / 日本語

Screenshots: Reasoning · LaTeX · Artifacts support

Main Features

  • 🌐 Supports multiple cloud LLM providers: DeepSeek, OpenAI, Silicon Flow, Grok, Gemini, Anthropic, etc.
  • 🏠 Supports local model deployment with Ollama, including comprehensive management capabilities: download, deploy, and run Ollama models without any command-line operations.
  • 🚀 Rich and easy-to-use chatbot capabilities
    • Complete Markdown rendering with excellent code block display.
    • Native support for simultaneous multi-session conversations; start new sessions without waiting for model generation to finish, maximizing efficiency.
    • Supports Artifacts rendering for richer result presentation; combined with MCP, this significantly reduces token consumption.
    • Messages support retry to generate multiple variations; conversations can be forked freely, ensuring there's always a suitable line of thought.
    • Supports rendering images, Mermaid diagrams, and other multi-modal content; includes Gemini's text-to-image capabilities.
    • Supports highlighting external information sources like search results within the content.
  • 🔍 Robust search extension capabilities
    • Built-in integration with leading search APIs like Brave Search via MCP mode, allowing the model to intelligently decide when to search.
    • Supports mainstream search engines like Google, Bing, Baidu, and Sogou Official Accounts search by simulating user web browsing, enabling the LLM to read search engines like a human.
    • Supports reading any search engine; simply configure a search assistant model to connect various search sources, whether internal networks, API-less engines, or vertical domain search engines, as information sources for the model.
  • 🔧 Excellent MCP (Model Context Protocol) support (see the configuration sketch after this list).
    • Extremely user-friendly configuration interface.
    • Aesthetically pleasing and clear tool call display.
    • Detailed tool call debugging window with automatic formatting of tool parameters and return data.
    • Built-in Node.js runtime environment; npx-like services require no extra configuration.
    • Supports StreamableHTTP/SSE/Stdio protocols.
    • Supports inMemory services with built-in utilities such as code execution, web information retrieval, and file operations, covering most common use cases out of the box without any additional installation.
    • Exposes vision-model capabilities as functions that any model can use, via the built-in MCP service.
  • 💻 Multi-platform support: Windows, macOS, Linux.
  • 🎨 Beautiful, user-friendly interface with a user-oriented design and carefully crafted light and dark themes.
  • 🔗 Rich DeepLink support: Initiate conversations via links for seamless integration with other applications. Also supports one-click installation of MCP services for simplicity and speed.
  • 🚑 Security-first design: chat data and configuration data expose reserved encryption interfaces and support code obfuscation.
  • 🛡️ Privacy protection: supports hiding content during screen sharing/projection, network proxies, and other privacy measures to reduce the risk of information leakage.
  • 💰 Business-friendly and open source, licensed under the Apache License 2.0.
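
To make the MCP support above more concrete, here is a rough configuration sketch. It follows the widely used "mcpServers" shape from the MCP ecosystem, written in TypeScript purely for illustration; DeepChat configures servers through its UI, its exact schema may differ, and every name, path, and URL below is a placeholder.

// Illustrative MCP server entries (not DeepChat's actual schema).
interface StdioServer {
  command: string;                 // executable launched via the built-in Node.js runtime
  args?: string[];
  env?: Record<string, string>;
}

interface HttpServer {
  url: string;                     // StreamableHTTP or SSE endpoint
  headers?: Record<string, string>;
}

const mcpServers: Record<string, StdioServer | HttpServer> = {
  // Stdio transport: an npx-style server, no separate Node.js installation required
  filesystem: {
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/workspace"],
  },
  // StreamableHTTP/SSE transport: a remote MCP endpoint
  "remote-tools": {
    url: "https://example.com/mcp",
    headers: { Authorization: "Bearer <token>" },
  },
};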

Currently Supported Model Providers


Ollama, DeepSeek, Silicon Flow, QwenLM, Doubao, MiniMax, Fireworks, PPIO, OpenAI, Gemini, GitHub Models, Moonshot, OpenRouter, Azure OpenAI, Qiniu, Grok

Compatible with any model provider in OpenAI/Gemini/Anthropic API format
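
In practice, "OpenAI API format" means the provider exposes a /v1/chat/completions style endpoint. A minimal sketch, assuming a generic OpenAI-compatible provider; the base URL, model name, and environment variable below are placeholders, not DeepChat settings:

// Minimal OpenAI-format chat completion request (placeholders throughout).
const BASE_URL = "https://api.example.com/v1";      // any OpenAI-compatible provider
const API_KEY = process.env.EXAMPLE_API_KEY ?? "";  // placeholder environment variable

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model: "example-model",
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;           // standard OpenAI response shape
}

Any endpoint that answers requests shaped like this can typically be used with DeepChat's OpenAI-compatible provider option; the same idea applies to the Gemini and Anthropic formats.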

Other Features

  • Support for local model management with Ollama
  • Support for local file processing
  • Artifacts support
  • Customizable search engines (parsed through models, no API adaptation required)
  • MCP support (built-in npx, no additional node environment installation needed)
  • Support for multimodal models
  • Local chat data backup and recovery
  • Compatibility with any model provider in OpenAI, Gemini, and Anthropic API formats

Development

Please read the Contribution Guidelines. Windows and Linux packages are built via GitHub Actions. For Mac-related signing and packaging, please refer to the Mac Release Guide.

Install dependencies

$ npm install
$ npm run installRuntime
# if you get the error "No module named 'distutils'", install setuptools
$ pip install setuptools
# for windows x64
$ npm install --cpu=x64 --os=win32 sharp
# for mac apple silicon
$ npm install --cpu=arm64 --os=darwin sharp
# for mac intel
$ npm install --cpu=x64 --os=darwin sharp
# for linux x64
$ npm install --cpu=x64 --os=linux sharp

Start development

$ npm run dev

Build

# For windows
$ npm run build:win

# For macOS
$ npm run build:mac

# For Linux
$ npm run build:linux

# Package for a specific architecture
$ npm run build:win:x64
$ npm run build:win:arm64
$ npm run build:mac:x64
$ npm run build:mac:arm64
$ npm run build:linux:x64
$ npm run build:linux:arm64

Star History

Star History Chart

Contributors

Thank you for considering contributing to DeepChat! The contribution guide can be found in the Contribution Guidelines.

📃 License

LICENSE
