
Centroid

A chat-based open source development platform for API discovery and testing.


Centroid Demo

🚧 Development Status: Centroid is in active development (alpha). The platform is fully functional, but you may encounter breaking changes as it evolves. We encourage you to try it out and provide feedback!

✨ Key Features

Centroid re-imagines API workflows through the power of natural conversation. Think of it as "Postman meets ChatGPT" - a modern, intuitive approach to API interaction that lets you:

  • 💬 Chat with API Collections: Import and interact with your APIs through natural conversation

    • Support for OpenAPI/Swagger specifications
    • Import Postman collections
    • Understand and explore API endpoints through chat
  • 🚀 Execute API Endpoints: Test and run API endpoints directly from the chat interface

    • Send requests with custom parameters
    • View response data in real-time
    • Save and reuse API configurations
  • 🤖 Flexible LLM Support: Works with any OpenAI-compatible API

    • Use OpenAI, Azure OpenAI, or any compatible endpoint
    • Support for various models (GPT-4, Claude, Llama)
    • Configurable model settings
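
A quick way to exercise the import workflow described above is to pull down a small public OpenAPI document, such as the Swagger Petstore spec, and load it as a collection in the chat interface (the URL below is a public sample spec, not something shipped with Centroid):

# Download a public OpenAPI 3.0 spec to use as a test collection
curl -o petstore.json https://petstore3.swagger.io/api/v3/openapi.json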

🚀 Quick Start

Using Docker

docker run -d \
  -p 3000:3000 -p 8000:8000 \
  -v Centroid_data:/app/data \
  -e LLM_BASE_URL=https://api.openai.com/v1 \
  -e LLM_API_KEY=your_api_key \
  -e LLM_DEFAULT_MODEL=gpt-4o-mini \
  -e FIRST_SUPERUSER=admin@example.com \
  -e FIRST_SUPERUSER_PASSWORD=example123 \
  --name Centroid --restart always \
  ghcr.io/srikanth235/Centroid:main

TIP

Include the volume mount -v Centroid_data:/app/data in your Docker command. It persists the database so you don't lose data between container restarts.

Visit http://localhost:3000 for the web interface and http://localhost:8000/docs for the backend OpenAPI documentation.
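
Once the container is running, a few standard Docker commands can confirm that it started cleanly and that the data volume exists (the container and volume names match the command above):

# Follow the container logs during startup
docker logs -f Centroid

# Confirm the persistent data volume was created
docker volume inspect Centroid_data

# Check that the backend is serving its OpenAPI docs
curl -sf -o /dev/null http://localhost:8000/docs && echo "backend is up"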

Environment Variables

# LLM Configuration
LLM_BASE_URL=https://api.openai.com/v1     # Base URL of any OpenAI-compatible API
LLM_API_KEY=your_api_key                   # API key for that endpoint
LLM_DEFAULT_MODEL=gpt-4o-mini              # Default model to use

# Authentication
FIRST_SUPERUSER=admin@example.com          # Default: admin@example.com
FIRST_SUPERUSER_PASSWORD=example123        # Default: example123

View all environment variables →
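
Because only an OpenAI-compatible base URL is required, local model servers work too. For example, a hypothetical configuration for a locally running Ollama instance (which exposes an OpenAI-compatible API under /v1) might look like this; the values below are illustrative, and the model name should be whatever you have pulled locally:

# LLM Configuration for a local Ollama server (example values, not defaults)
LLM_BASE_URL=http://localhost:11434/v1     # Ollama's OpenAI-compatible endpoint
LLM_API_KEY=ollama                         # Ollama accepts any placeholder key
LLM_DEFAULT_MODEL=llama3.2                 # A locally pulled model with tool-calling support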

🤖 Choosing an LLM

Centroid requires an LLM with function/tool calling capabilities. We recommend using any of these tested models:

  • GPT-4o-mini
  • Claude Haiku
  • Llama 3.2 (70B)

Any models at least as powerful as the ones listed above will work well with Centroid. Models without tool calling capabilities may have limited functionality.
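
If you are unsure whether your endpoint supports tool calling, a quick probe is to send a request with a tools definition and check the reply. This is a minimal sketch using the LLM_* variables from the configuration above; the get_weather function is made up purely for the test:

# Probe an OpenAI-compatible endpoint for tool-calling support
curl -s "$LLM_BASE_URL/chat/completions" \
  -H "Authorization: Bearer $LLM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Return the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {"city": {"type": "string"}},
          "required": ["city"]
        }
      }
    }]
  }'

If the response contains a tool_calls entry, the model can drive Centroid's API-execution features; if it answers in plain text only, expect limited functionality.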

📊 Telemetry

Centroid includes optional telemetry to help improve the platform. This feature:

  • Is enabled by default and requires explicit opt-out
  • Only tracks API usage patterns, never sensitive data
  • Helps us understand how features are used and identify performance issues

What We Track

When enabled, Centroid tracks:

  • Chat API interactions (create/update/delete operations)
  • Basic request metrics (duration, status codes)
  • Anonymous usage patterns
  • Performance indicators

NOTE

You can view the exact events we track in our analytics implementation.

Configuration

Control telemetry through environment variables:

# Enable/disable telemetry
TELEMETRY_ENABLED=false
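
If you deploy with the Docker command from the Quick Start, one convenient way to set the opt-out is to collect all settings in an env file; this is just one approach, and any method of passing environment variables to the container works:

# Gather all settings, including the telemetry opt-out, in one file
cat > centroid.env <<'EOF'
LLM_BASE_URL=https://api.openai.com/v1
LLM_API_KEY=your_api_key
LLM_DEFAULT_MODEL=gpt-4o-mini
FIRST_SUPERUSER=admin@example.com
FIRST_SUPERUSER_PASSWORD=example123
TELEMETRY_ENABLED=false
EOF

# Pass the file to the container instead of individual -e flags
docker run -d -p 3000:3000 -p 8000:8000 \
  -v Centroid_data:/app/data \
  --env-file centroid.env \
  --name Centroid --restart always \
  ghcr.io/srikanth235/Centroid:main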

Privacy Considerations

  • No personal data or chat content is ever collected
  • All tracking is anonymous
  • Performance metrics are aggregated
  • You can self-host without any external analytics

🌟 Contributing

We love contributions! Here's how to get started:

Development Setup

  1. Clone the repository

    git clone https://github.com/Centroid/Centroid.git
    cd Centroid
    
  2. Install dependencies

    # Frontend
    pnpm install
    
    # Backend
    poetry install
    
  3. Start development servers

    ./start.sh
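
./start.sh brings up both services together. If you prefer to run them separately, a rough sketch follows; the frontend/ and backend/ directory names, the pnpm dev script, and the app.main:app module path are assumptions for illustration, so check start.sh for the commands the project actually uses:

    # Backend API on port 8000 (hypothetical module path -- see start.sh)
    (cd backend && poetry run uvicorn app.main:app --reload --port 8000) &

    # Frontend dev server (assumes a standard "dev" script in package.json)
    (cd frontend && pnpm dev)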
    

Making Changes

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

🤝 Support & Community

Need help? Join our community:

🙏 Credits

The initial foundation of this project was built using these excellent open-source boilerplate projects:

📄 License

Centroid is Apache 2.0 licensed.


Made with ❤️ by humans and AI
