
CarrotAI


AI Agent with Multi-Server Streaming & Multi-Language Support

Flutter Frontend + FastAPI Backend

🚀 Try it online | Download the SaaS client

🇨🇳 Read the Chinese documentation


🥕 Introduction

CarrotAI is a cutting-edge AI agent application that delivers real-time streaming chat via Server-Sent Events (SSE) and streamable HTTP, with built-in Model Context Protocol (MCP) integration. It supports concurrent connections to multiple SSE MCP servers and provides user interfaces in English, Chinese, and Japanese.
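
Over the wire, each chunk of an SSE chat stream arrives as an event: one or more `data:` lines terminated by a blank line. As an illustration of that wire format (a stdlib-only sketch, not CarrotAI's actual client code), a minimal parser might look like:

```python
def parse_sse(raw: str) -> list[dict]:
    """Parse raw Server-Sent Events text into a list of events.

    Each event is a dict with optional 'event' and 'data' keys;
    multi-line data fields are joined with newlines, per the SSE spec.
    """
    events, current = [], {}
    for line in raw.splitlines():
        if not line:                       # blank line terminates an event
            if current:
                events.append(current)
                current = {}
        elif line.startswith("data:"):
            value = line[5:].lstrip(" ")
            if "data" in current:
                current["data"] += "\n" + value
            else:
                current["data"] = value
        elif line.startswith("event:"):
            current["event"] = line[6:].strip()
    if current:                            # flush a trailing unterminated event
        events.append(current)
    return events


stream = "event: message\ndata: Hello\ndata: world\n\ndata: [DONE]\n\n"
print(parse_sse(stream))
# → [{'event': 'message', 'data': 'Hello\nworld'}, {'data': '[DONE]'}]
```

A real client would read these events incrementally from the HTTP response body rather than from a complete string.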

🚀 Features

  • AI Agent: Real-time chat powered by SSE and MCP adapters for a seamless conversational experience.
  • Multi-Server Support: Connect to and call multiple SSE MCP servers simultaneously to aggregate intelligent responses.
  • Multi-Language: Full localization in English, 中文 (Chinese), and 日本語 (Japanese).
  • Deep Thinking Mode: Advanced analysis for complex or multi-step queries.
  • Authentication: Secure login/register flow using JWT tokens.
  • Responsive UI: Adaptive design for mobile, desktop, and web platforms.
  • Theme Customization: Light/dark mode, custom seed colors, and dynamic Material 3 theming via dynamic_color.
  • File Upload: Attach and parse files within conversations for richer context.

🤖 Supported Model APIs

  • DeepSeek: Advanced language model with strong reasoning capabilities

๐Ÿ› ๏ธ Tech Stack

Frontend

  • Framework: Flutter
  • State Management: Provider
  • UI: Material Design 3
  • Localization: flutter gen-l10n
  • Theming: dynamic_color

Backend

  • Framework: FastAPI
  • Streaming: Server-Sent Events (SSE)
  • AI Integration: DeepSeek LLM, MCP (Model Context Protocol)
  • Database: PostgreSQL + SQLAlchemy
  • Authentication: JSON Web Tokens
  • Migrations: Alembic
  • Deployment: Uvicorn & Gunicorn

📋 Prerequisites

  • Flutter SDK ^3.7.2
  • Python >=3.12
  • PostgreSQL

⚡ Quick Start

Make sure uv is installed before running the steps below.

# Clone repository
git clone https://github.com/Xingsandesu/CarrotAI.git && cd CarrotAI

# Set up environment variables
mv backend/.env.example backend/.env && mv .env.example .env

# Edit environment variables
vim .env
vim backend/.env

# Temporarily start PostgreSQL
docker-compose -f docker-compose.yml -f docker-compose.override.yml up -d postgres

# Backend setup
uv run backend/scripts/startup.py --user --email <email> --username <name> --password <password>

# Stop PostgreSQL
docker-compose -f docker-compose.yml -f docker-compose.override.yml down

# Edit the config files
vim config/

# Run Backend
docker compose up -d

🔧 Installation

Backend Setup

  1. Navigate to the backend directory:
    cd backend
    
  2. Create and activate a virtual environment:
    uv sync
    
  3. Copy the example environment file:
    cp .env.example .env
    
  4. Initialize the database and default configuration:
    uv run scripts/init_db.py && uv run scripts/init_config.py
    
  5. Run the server:
    python main.py       # Development mode
    python main.py prod  # Production mode with Gunicorn
    

Frontend Setup

  1. Return to the project root:
    cd ..
    
  2. Fetch Flutter dependencies:
    flutter pub get
    
  3. Generate localization files:
    flutter gen-l10n
    
  4. Launch the app:
    flutter run
    
  5. Build for web:
    flutter build web --wasm
    

๐ŸŒ Configuration

  • Frontend: Edit lib/core/config/app_config.dart for API endpoints and theming defaults.
  • Backend: Configure .env and backend/app/core/config.py for the database and MCP servers.

Backend Configuration Files

The backend uses JSON files located in backend/config/ to define models, MCP servers, and custom adapters. Below is the default folder structure:

backend/config/
├── model_configs.json       # LLM model definitions and metadata
├── mcp_servers.json         # SSE MCP server endpoints and env settings
└── app/                     # Custom adapter definitions
    └── duckduckgo-search.json

model_configs.json

Defines available LLM models for CarrotAI. Each entry includes:

  • id (string): Unique model identifier.
  • icon (string): Icon name for display.
  • translations (object): Localized names and descriptions (zh, en, ja).
  • exclusiveRules (object): Toggles and exclusion rules for features.

Example:

[
  {
    "id": "deepseek",
    "icon": "smart_toy_outlined",
    "translations": {
      "zh": { "name": "DeepSeek", "description": "专注于深度思考和复杂推理的满血模型" },
      "en": { "name": "DeepSeek", "description": "Powerful Chinese large model focused on deep thinking and complex reasoning" },
      "ja": { "name": "DeepSeek", "description": "深い思考と複雑な推論に特化した強力な中国語大規模モデル" }
    },
    "exclusiveRules": {
      "deepThinking": { "enabled": true, "excludes": ["mcpServices"] },
      "mcpServices": { "enabled": true, "excludes": ["deepThinking"] }
    }
  }
]
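
The exclusiveRules object above can be read as a mutual-exclusion map: enabling one feature disables the features it lists under excludes. One possible interpretation in Python (a sketch of the rule semantics, not the backend's actual logic; the function name is hypothetical):

```python
def allowed_features(requested: set[str], rules: dict) -> set[str]:
    """Filter requested features using exclusiveRules-style entries.

    `rules` maps feature name -> {"enabled": bool, "excludes": [names...]}.
    Features are considered in sorted order so the result is deterministic.
    """
    active = set()
    for feature in sorted(requested):
        rule = rules.get(feature, {})
        if not rule.get("enabled", True):
            continue                       # feature switched off outright
        # skip this feature if an already-active feature excludes it
        if any(feature in rules.get(a, {}).get("excludes", []) for a in active):
            continue
        active.add(feature)
    return active


rules = {
    "deepThinking": {"enabled": True, "excludes": ["mcpServices"]},
    "mcpServices": {"enabled": True, "excludes": ["deepThinking"]},
}
print(allowed_features({"deepThinking", "mcpServices"}, rules))
# → {'deepThinking'} (first feature in sorted order wins the conflict)
```

With the deepseek rules above, requesting both deep thinking and MCP services keeps only one of them, matching the mutual exclusion the config declares.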

mcp_servers.json

Specifies SSE Model Context Protocol (MCP) endpoints. Format:

  • Keys: service names.
  • url (string): SSE endpoint URL.
  • env (object): Environment variables for the adapter.

Example:

{
  "serviceA": {
    "url": "http://localhost:10000/sse",
    "env": {
      "API_KEY": "your_api_key"
    }
  }
}
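
Since a malformed entry here would only surface at connection time, it can help to check the shape up front. A hypothetical validator for this file (assuming only that each entry has a url string and an optional env object):

```python
import json


def validate_mcp_servers(text: str) -> dict:
    """Parse mcp_servers.json text and check each entry's shape.

    Raises ValueError on a malformed entry; returns the parsed mapping.
    """
    servers = json.loads(text)
    for name, cfg in servers.items():
        if not isinstance(cfg.get("url"), str):
            raise ValueError(f"{name}: 'url' must be a string")
        if not isinstance(cfg.get("env", {}), dict):
            raise ValueError(f"{name}: 'env' must be an object")
    return servers


sample = '{"serviceA": {"url": "http://localhost:10000/sse", "env": {"API_KEY": "your_api_key"}}}'
print(validate_mcp_servers(sample)["serviceA"]["url"])
# → http://localhost:10000/sse
```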

Custom Adapters (app/*.json)

Place custom MCP adapters in backend/config/app/. Each file defines:

  • id (string): Adapter identifier.
  • icon (string): Emoji or icon name.
  • mcpServer (object): Same structure as entries in mcp_servers.json.
  • translations (object): Localized UI metadata.

Example (duckduckgo-search.json):

{
  "id": "duckduckgo-search",
  "icon": "๐Ÿ”",
  "mcpServer": {
    "url": "http://localhost:10000/duckduckgo-search",
    "env": {}
  },
  "transportType": "sse",
  "translations": {
    "en": { "name": "DuckDuckGo Search", "type": "Search Tool", "description": "Use DuckDuckGo search engine for secure and private web searches" },
    "zh": { "name": "DuckDuckGo搜索", "type": "搜索工具", "description": "使用DuckDuckGo搜索引擎进行安全、私密的网络搜索" },
    "ja": { "name": "DuckDuckGo検索", "type": "検索ツール", "description": "DuckDuckGo検索エンジンを使用して安全でプライベートなウェブ検索を行います" }
  }
}
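
How the backend combines these per-file adapters with the shared mcp_servers.json is not documented here; a plausible merge, keeping each adapter's mcpServer entry under its id, could look like this (the function name and override rule are assumptions):

```python
def merge_adapters(servers: dict, adapters: list[dict]) -> dict:
    """Merge base MCP server entries with custom adapter definitions.

    Each adapter contributes its `mcpServer` config under its `id`;
    an adapter with the same name as a base entry overrides it.
    """
    merged = dict(servers)
    for adapter in adapters:
        merged[adapter["id"]] = adapter["mcpServer"]
    return merged


base = {"serviceA": {"url": "http://localhost:10000/sse", "env": {}}}
duck = {
    "id": "duckduckgo-search",
    "mcpServer": {"url": "http://localhost:10000/duckduckgo-search", "env": {}},
}
endpoints = merge_adapters(base, [duck])
print(sorted(endpoints))
# → ['duckduckgo-search', 'serviceA']
```

In practice the adapter dicts would be loaded by globbing backend/config/app/*.json and parsing each file with the json module.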

Usage

  1. Initialize default configurations:
    uv run scripts/init_config.py
    
  2. Modify JSON files under backend/config/ to add or update models and endpoints.
  3. Restart the backend server to apply changes.

🔧 Environment Variables

Backend (.env)

  • DATABASE_URL: PostgreSQL connection URL (required)
  • BACKEND_CORS_ORIGINS: Allowed CORS origins, comma-separated (default: [])
  • MCP_SERVERS: JSON list of SSE MCP server endpoints (required)
  • SECRET_KEY: JWT secret key (required)
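
For illustration, these variables could be read and validated at startup like this (a stdlib-only sketch; the actual backend presumably has its own settings loader, and the returned key names are made up here):

```python
import json
import os

REQUIRED = ("DATABASE_URL", "SECRET_KEY", "MCP_SERVERS")


def load_settings(environ=os.environ) -> dict:
    """Read backend settings from environment variables, failing fast
    on any missing required key."""
    missing = [key for key in REQUIRED if not environ.get(key)]
    if missing:
        raise RuntimeError("missing required settings: " + ", ".join(missing))
    # comma-separated origins; empty/whitespace entries are dropped
    origins = [o.strip() for o in environ.get("BACKEND_CORS_ORIGINS", "").split(",") if o.strip()]
    return {
        "database_url": environ["DATABASE_URL"],
        "secret_key": environ["SECRET_KEY"],
        "mcp_servers": json.loads(environ["MCP_SERVERS"]),  # JSON list
        "cors_origins": origins,                            # default: []
    }
```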

Frontend (lib/core/config/app_config.dart)

static String get baseUrl => "http://127.0.0.1:8000";

💡 Usage

  1. Launch backend and frontend as shown in Quick Start.
  2. Open the app in your browser or mobile emulator.
  3. Register or log in to obtain a JWT token.
  4. Use deep thinking mode or default chat mode to interact with the AI agent.
  5. Switch between MCP servers or add new endpoints under Settings.

🔗 API Reference

Access the interactive Swagger UI at:

http://127.0.0.1:8000/docs

๐Ÿ›ฃ๏ธ Roadmap

  • SSE multi-server support
  • Multi-language (EN, ไธญๆ–‡, ๆ—ฅๆœฌ่ชž)
  • Docker Compose setup
  • streamable HTTP support
  • Local Stdio multi-server support
  • Local OCR support
  • Support for more formats of the upload interface
  • Frontend custom prompts
  • More model support
  • More language support

๐Ÿ›ก๏ธ Security

  • Authentication: All backend endpoints secured with JWT; tokens stored securely in encrypted storage.
  • Data Protection: Use HTTPS in production; configure allowed CORS origins via BACKEND_CORS_ORIGINS in .env.
  • Secret Management: Define SECRET_KEY in .env; ensure no secrets are committed to source control.

๐Ÿ” Monitoring & Logging

  • Server Logs: Configured in gunicorn.conf.py; access and error logs in logs/.
  • Application Logs: Uses Loguru for structured logging; frontend disables debugPrint in release mode.

🚀 Performance & Optimization

  • Caching: Frontend caches static assets; backend uses async connection pooling for PostgreSQL.
  • Bundle Size: Web artifact built with --wasm for optimized delivery.

๐Ÿ—‚๏ธ Changelog

All notable changes are documented in CHANGELOG.md.

📱 Screenshots

๐Ÿค Contributing

Contributions are welcome! Please open a Pull Request with your suggestions.

📄 License

This project is licensed under the CarrotAI Open Source License. See the LICENSE file for details.
