aiowhitebit-mcp
Model Context Protocol (MCP) server and client implementation for the WhiteBit cryptocurrency exchange API. Built on top of the aiowhitebit library and fastmcp.
Features
- MCP server for WhiteBit API with public endpoints
- Support for multiple transport protocols (stdio, SSE, WebSocket)
- Easy-to-use client for interacting with the MCP server
- Command-line interface for running the server
- Integration with Claude Desktop
- Real-time market data via WebSocket
- Comprehensive test coverage and type checking
- Modern development tools (ruff, pyright, pre-commit)
- Caching with disk persistence
- Rate limiting and circuit breaker patterns
Quick Start
# Install the package
pip install aiowhitebit-mcp
# Run the server (stdio transport for Claude Desktop)
aiowhitebit-mcp --transport stdio
# Or run with SSE transport
aiowhitebit-mcp --transport sse --host 127.0.0.1 --port 8000
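For Claude Desktop, the stdio server is typically registered in claude_desktop_config.json. The JSON below is a minimal sketch of such an entry, not taken from this project: the server key "whitebit" is arbitrary, and it assumes the aiowhitebit-mcp executable is on your PATH.

{
  "mcpServers": {
    "whitebit": {
      "command": "aiowhitebit-mcp",
      "args": ["--transport", "stdio"]
    }
  }
}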
Basic Usage
Client with Network Transport
import asyncio
import os
from aiowhitebit_mcp.client import WhiteBitMCPClient

async def main():
    # Set the server URL (or use environment variable)
    server_url = "http://localhost:8000/sse"
    os.environ["WHITEBIT_MCP_URL"] = server_url

    async with WhiteBitMCPClient() as client:
        # Get market info
        btc_usdt = await client.get_market_resource("BTC_USDT")
        print("BTC/USDT Market Info:", btc_usdt)

        # Get real-time price via WebSocket
        price = await client.get_last_price("BTC_USDT")
        print("Current BTC/USDT price:", price)

        # Get order book
        orderbook = await client.get_orderbook("BTC_USDT")
        print("Order book:", orderbook)

if __name__ == "__main__":
    asyncio.run(main())
Server Configuration
from aiowhitebit_mcp.server import create_server
import asyncio

# Create the server with custom configuration
server = create_server(
    name="WhiteBit API"
)

# Run the server with desired transport
if __name__ == "__main__":
    asyncio.run(
        server.run(
            transport="stdio",  # or "sse"
            host="127.0.0.1",   # for network transports
            port=8000           # for network transports
        )
    )
Available Tools
Public API
- get_server_time(): Get current server time
- get_market_info(): Get all markets information
- get_orderbook(market: str): Get order book
- get_recent_trades(market: str, limit: int = 100): Get recent trades
- get_ticker(market: str): Get ticker information
- get_fee(market: str): Get trading fees
- get_server_status(): Get server status
- get_asset_status_list(): Get status of all assets
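As a sketch of how these tools are typically consumed from Python, the snippet below assumes WhiteBitMCPClient exposes coroutine helpers named after the tools; only get_orderbook is shown elsewhere in this README, so treat get_recent_trades and get_ticker here as assumptions.

import asyncio
from aiowhitebit_mcp.client import WhiteBitMCPClient

async def show_public_data():
    # Assumes WHITEBIT_MCP_URL points at a running server (see Basic Usage)
    async with WhiteBitMCPClient() as client:
        # Order book for a market (documented above)
        orderbook = await client.get_orderbook("BTC_USDT")
        # Recent trades and ticker -- assumed helpers mirroring the tool names
        trades = await client.get_recent_trades("BTC_USDT", limit=10)
        ticker = await client.get_ticker("BTC_USDT")
        print(orderbook, trades, ticker)

if __name__ == "__main__":
    asyncio.run(show_public_data())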
WebSocket API
- get_last_price(market: str): Get real-time price
- get_market_depth(market: str): Get real-time order book
Resources
- whitebit://markets: Get all markets information
- whitebit://markets/{market}: Get specific market information
- whitebit://assets: Get all assets information
- whitebit://assets/{asset}: Get specific asset information
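The per-market resource is read through the get_market_resource helper used in Basic Usage above; a minimal sketch tying the URI to the call (no new client methods assumed):

import asyncio
from aiowhitebit_mcp.client import WhiteBitMCPClient

async def show_market_resource():
    # Assumes WHITEBIT_MCP_URL points at a running server (see Basic Usage)
    async with WhiteBitMCPClient() as client:
        # Corresponds to the whitebit://markets/BTC_USDT resource
        market = await client.get_market_resource("BTC_USDT")
        print(market)

if __name__ == "__main__":
    asyncio.run(show_market_resource())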
Command-line Interface
# Show help
aiowhitebit-mcp --help
# Run with stdio transport (for Claude Desktop)
aiowhitebit-mcp --transport stdio
# Run with SSE transport
aiowhitebit-mcp --transport sse --host localhost --port 8000
Development
# Clone the repository
git clone https://github.com/yourusername/aiowhitebit-mcp.git
cd aiowhitebit-mcp
# Install development dependencies
pip install -e ".[dev]"
# Install pre-commit hooks
pre-commit install
# Run tests
pytest
# Run type checking
pyright src/aiowhitebit_mcp
# Run linting
ruff check .
Examples
Check the examples/ directory for more usage examples:
- claude_desktop_server.py: Run the server with stdio transport for Claude Desktop
- claude_desktop_client.py: Client for connecting to a stdio server
- sse_server.py: Run the server with SSE transport
- sse_client.py: Client for connecting to an SSE server
License
Apache License 2.0