
# BigBugAI MCP (Python + FastMCP)

Production-ready MCP server exposing BigBugAI tools with stdio transport for local MCP clients (Claude Desktop, Cursor) and optional HTTP/SSE transport (FastAPI + uvicorn) for remote access.

## Features

- Auth via API key (env var)
- Per-key rate limiting (moving window; configurable via env)
- Typed Pydantic schemas for tool I/O
- Clean error handling and JSON-stable outputs
- Tests, ruff, and mypy configuration

## Tools

- `get_trending_tokens(limit: int = 10) -> list[dict]`
  - GET `${BTUNIFIED_API}/api/tokens/newly-ingested` by default
  - Falls back to `${BTUNIFIED_API}/v1/trending/tokens` and a few other candidates on 404
  - Override the primary path with `BTUNIFIED_TRENDING_PATH`
  - Normalizes `{items: [...]}` to `[...]`
- `token_analysis_by_contract(chain: str, address: str) -> dict`
  - GET `${BTUNIFIED_API}/api/token-intel/{chain}/{address}/report`

## Requirements

- Python 3.11+
- Packages: `mcp[cli]`, `httpx`, `pydantic`, `fastapi`, `uvicorn`, `limits`, `pytest`, `ruff`, `mypy`

## Environment variables

- `BIGBUGAI_MCP_API_KEY` (required)
- `BIGBUGAI_API_KEY` / `BIGBUGAI_API_TOKEN` (optional; used for upstream HTTP calls if set; otherwise those calls fall back to `BIGBUGAI_MCP_API_KEY`)
- `BTUNIFIED_API` (default: `https://api.bigbug.ai`)
- `MCP_RATE_LIMIT` (default: `60/hour`; a rate string as understood by `limits`)

## Install

Using uv (recommended):

Unix/macOS:

```bash
uv venv
source .venv/bin/activate
uv pip install -e .[dev]
```

Windows PowerShell:

```powershell
uv venv
.venv\Scripts\Activate.ps1
uv pip install -e .[dev]
```

Alternatively, add the packages explicitly (this updates pyproject.toml as needed):

```bash
uv add "mcp[cli]" httpx pydantic fastapi uvicorn limits pytest ruff mypy
```

## Run (STDIO)

```bash
export BIGBUGAI_MCP_API_KEY="your-secret"
export BTUNIFIED_API="https://api.bigbug.ai"
uv run -m bigbugai_mcp.server_stdio
```

Windows PowerShell:

```powershell
$env:BIGBUGAI_MCP_API_KEY="your-secret"
$env:BTUNIFIED_API="https://api.bigbug.ai"
uv run -m bigbugai_mcp.server_stdio
```

This mode is intended for local MCP clients (e.g., Claude Desktop, Cursor).
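For intuition about the `get_trending_tokens` behavior listed under Tools, here is a minimal sketch of the path fallback and payload normalization. The helper names (`candidate_paths`, `normalize_tokens`) are invented for illustration and are not the server's actual identifiers:

```python
# Sketch of the trending-endpoint fallback and response normalization
# described under Tools. Helper names are illustrative only.
import os
from typing import Any

DEFAULT_CANDIDATES = [
    "/api/tokens/newly-ingested",  # primary endpoint
    "/v1/trending/tokens",         # tried next if the primary 404s
]

def candidate_paths() -> list[str]:
    """BTUNIFIED_TRENDING_PATH, if set, overrides the primary path."""
    override = os.getenv("BTUNIFIED_TRENDING_PATH")
    if override:
        return [override, *DEFAULT_CANDIDATES]
    return list(DEFAULT_CANDIDATES)

def normalize_tokens(payload: Any) -> list[dict]:
    """Normalize {"items": [...]} (or an already-bare list) to a list."""
    if isinstance(payload, dict) and isinstance(payload.get("items"), list):
        return payload["items"]
    if isinstance(payload, list):
        return payload
    raise ValueError(f"unexpected trending payload: {type(payload).__name__}")
```

The server would try each candidate path in order with `httpx`, stopping at the first non-404 response, then run the body through this kind of normalization before returning it to the MCP client.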

Note: Tools no longer require api_key in the payload. The server reads the API key from the environment (BIGBUGAI_MCP_API_KEY) and applies rate limiting based on it.

## Claude Desktop config

Create `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "bigbugai": {
      "command": "uv",
      "args": ["-m", "bigbugai_mcp.server_stdio"],
      "env": {
        "BIGBUGAI_MCP_API_KEY": "your-secret",
        "BTUNIFIED_API": "https://api.bigbug.ai",
        "MCP_RATE_LIMIT": "60/hour"
      }
    }
  }
}
```

## Run (HTTP)

```bash
uv run -m bigbugai_mcp.server_http
```

The server listens on port 8000:

```bash
curl -s http://localhost:8000/healthz
```

Expected output:

```
ok
```

MCP HTTP/SSE endpoints are mounted under `/mcp`. Depending on your FastMCP version, an SSE stream may be available at `/mcp/sse`.

Example cURL (SSE; `-N` keeps the connection open; primarily for debugging):

```bash
curl -N http://localhost:8000/mcp/sse
```

Note: MCP over HTTP/SSE is designed for compatible clients; manual cURL interaction is limited.
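For reference, the Server-Sent Events framing that `curl -N` prints (blank-line-delimited `field: value` blocks) can be decoded with a few lines. This parser is simplified relative to the full SSE spec — it ignores comment lines and does not accumulate multi-line `data:` fields:

```python
# Simplified parser for SSE wire framing (the format curl -N prints).
# Illustrative only; real MCP clients handle the full SSE spec.
def parse_sse(stream: str) -> list[dict[str, str]]:
    events: list[dict[str, str]] = []
    current: dict[str, str] = {}
    for line in stream.splitlines():
        if not line:                      # blank line ends an event
            if current:
                events.append(current)
                current = {}
            continue
        if line.startswith(":"):          # SSE comment (keep-alive), ignore
            continue
        field, _, value = line.partition(":")
        current[field] = value.lstrip(" ")
    if current:
        events.append(current)
    return events
```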

## Smoke scripts

Quick sanity checks for the tools (require an API key in env).

Trending (uses `BIGBUGAI_API_KEY`/`BIGBUGAI_API_TOKEN`/`BIGBUGAI_MCP_API_KEY` from env):

```bash
uv run python scripts/smoke_trending.py -l 5
```

Token analysis (requires `BIGBUGAI_MCP_API_KEY` in env; optionally set `BB_CHAIN`/`BB_ADDRESS`):

```powershell
$env:BIGBUGAI_MCP_API_KEY="your-secret"  # PowerShell
uv run python scripts/smoke_token_analysis.py
```

## Testing and Quality

Run unit tests:

```bash
uv run pytest -q
```

Lint:

```bash
uv run ruff check .
```

Type check:

```bash
uv run mypy src
```

## Project layout

```
bigbugai-mcp/
  .github/workflows/ci.yml
  CODE_OF_CONDUCT.md
  CONTRIBUTING.md
  LICENSE
  README.md
  SECURITY.md
  pyproject.toml
  scripts/
    list_tools.py
    smoke_token_analysis.py
    smoke_trending.py
  src/bigbugai_mcp/
    __init__.py
    auth.py
    models.py
    server_http.py
    server_stdio.py
    tools.py
  tests/
    test_tools.py
```

## Security

- Rotate API keys regularly
- Keep HTTP mode behind OAuth or a reverse proxy if exposed publicly
- Rate limits are per API key, using a moving-window strategy

See SECURITY.md for reporting vulnerabilities.

## Extending

Add more tools for BigBugAI endpoints (portfolio manager, investment suggester, etc.):

1. Add new Pydantic request/response models in `src/bigbugai_mcp/models.py`
2. Add the tool function in `src/bigbugai_mcp/tools.py`
3. Decorate it with `@guarded` and register it in `register_tools()`
4. Write tests in `tests/`

## Contributing

Please see CONTRIBUTING.md for guidelines.
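The Extending workflow can be sketched standalone with a plain dict standing in for the FastMCP registry. Everything here is a hypothetical stand-in: this `guarded` is a no-op placeholder for the server's real auth/rate-limit decorator, and `get_token_holders` is an invented example tool, not part of the project:

```python
# Hypothetical sketch of the Extending steps. A dict registry stands in
# for FastMCP; `guarded` is a no-op placeholder; `get_token_holders` is
# an invented example tool.
import functools
from typing import Any, Callable

TOOLS: dict[str, Callable[..., Any]] = {}

def guarded(fn: Callable[..., Any]) -> Callable[..., Any]:
    """Placeholder for the real @guarded (API-key check + rate limiting)."""
    @functools.wraps(fn)
    def wrapper(*args: Any, **kwargs: Any) -> Any:
        # Real code would verify BIGBUGAI_MCP_API_KEY and hit the limiter.
        return fn(*args, **kwargs)
    return wrapper

@guarded
def get_token_holders(chain: str, address: str) -> dict:
    """Invented tool; a real one would call a BigBugAI endpoint."""
    return {"chain": chain, "address": address, "holders": []}

def register_tools() -> None:
    """Register each tool by name, mirroring the project's register_tools()."""
    TOOLS[get_token_holders.__name__] = get_token_holders

register_tools()
```

In the real project the registration happens against the FastMCP server instance and the request/response shapes are Pydantic models, but the decorate-then-register flow is the same.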

## Code of Conduct

This project follows the Contributor Covenant.

## License

MIT; see LICENSE.
