
Hedera MCP Server

Created by hashgraph-online

A production-ready Model Context Protocol (MCP) server that brings Hedera hashgraph operations to AI agents and LLMs through natural language. Execute transactions, check balances, manage tokens, and more - all by simply describing what you want to do in plain English.

🚀 New to MCP? The Model Context Protocol lets AI assistants like Claude interact with external systems. This server makes Hedera network operations available to any MCP-compatible AI tool. It will also support HCS-10 for direct agent-to-agent communication on Hedera (coming soon).

🎉 What's New

  • 50% Smaller Dependencies: Using pnpm reduced node_modules from 15GB to ~8GB
  • Zero-Config Setup: Automatic database initialization - just run and go!
  • Better Docker Support: Optimized images with separate dev/prod configurations
  • Hot Reload Everything: Changes apply instantly without restarts
  • Analytics Dashboard: Track tool usage and credit consumption with beautiful charts

✨ What Can You Do?

  • Natural Language Transactions: "Transfer 5 HBAR from my account to 0.0.123456"
  • Token Operations: "Create a new token called MyToken with symbol MTK"
  • Smart Contracts: "Deploy a contract and call the mint function"
  • Account Management: "Check my account balance and transaction history"
  • Scheduled Transactions: "Schedule a transfer for next week"
  • Credit System: Pay-per-use model with HBAR payments
  • Secure Authentication: API key-based auth with Hedera signature verification, managed via MCP tools.
  • HCS-10 Agent Registration: Register and operate as an HCS-10 compliant agent.

🚀 Quick Start (2 Minutes)

Docker Setup (Fastest - Zero Config)

# Clone the repository
git clone https://github.com/hashgraph-online/hedera-mcp-server.git
cd hedera-mcp-server

# Configure your environment
cp env.example .env
# Edit .env with your Hedera credentials (see Configuration section below)

# Start everything with Docker
npm run docker:dev:full

🎉 That's it! Everything is running with automatic database setup, optimized dependencies, and hot reload.

Local Development Setup

# Clone and install (using pnpm for efficient dependency management)
git clone https://github.com/hashgraph-online/hedera-mcp-server.git
cd hedera-mcp-server
npm install -g pnpm@9.14.4  # Install pnpm if you don't have it
pnpm install                # Fast install with deduplication

# Configure environment
cp env.example .env
# Edit .env with your credentials

# Start development (auto-initializes database)
pnpm run dev:full

Configuration

Edit the .env file with your Hedera credentials:

# Required: Your Hedera account (get free testnet HBAR at portal.hedera.com)
HEDERA_OPERATOR_ID=0.0.YOUR_ACCOUNT_ID
HEDERA_OPERATOR_KEY=your-private-key-here

# Required: Server account for HBAR payments and HCS-10 Identity
SERVER_ACCOUNT_ID=0.0.YOUR_SERVER_ACCOUNT_ID
SERVER_PRIVATE_KEY=your_server_private_key_here

# Required: OpenAI API key for natural language processing
OPENAI_API_KEY=sk-your-openai-key-here

# Optional: Network (testnet is default)
HEDERA_NETWORK=testnet

# Optional: Authentication (enabled by default)
REQUIRE_AUTH=true
API_KEY_ENCRYPTION_KEY=your-strong-secret-key # Required for production

Service Endpoints

Once running, your services are available at:

  • MCP Server (SSE): http://localhost:3000/stream - For AI assistants like Claude
  • MCP Server (JSON-RPC): http://localhost:3000/ - For HTTP POST requests
  • Admin Portal: http://localhost:3001 - Web UI for managing credits and users
  • Metrics API: http://localhost:3003/metrics - Prometheus metrics

🔧 Available Tools & Commands

The server provides a suite of powerful MCP tools for Hedera operations. These are accessed via an MCP client connected to the main MCP server on port 3000 (http://localhost:3000/ for JSON-RPC or http://localhost:3000/stream for SSE). Key tools include:

🏥 Health & System

  • health_check: Check server status, connectivity, and HCS-10 registration status.
  • get_server_info: Get server configuration, capabilities, and list of available tools.
  • refresh_profile: (If HCS-10 enabled) Update server's HCS-10 profile registration on Hedera.

💰 Transaction Operations

  • execute_transaction: Execute any Hedera transaction based on a natural language request (requires OpenAI).
    {
      "name": "execute_transaction",
      "arguments": {
        "request": "Transfer 10 HBAR to 0.0.123456 with memo 'payment'"
      }
    }
    
  • schedule_transaction: Schedule transactions for future execution based on a natural language request (requires OpenAI).
    {
      "name": "schedule_transaction",
      "arguments": { "request": "Schedule a token transfer for next Friday" }
    }
    
  • generate_transaction_bytes: Generate unsigned transaction bytes from a natural language request (requires OpenAI).
    {
      "name": "generate_transaction_bytes",
      "arguments": {
        "request": "Create transaction bytes for deploying a smart contract"
      }
    }
    
  • execute_query: Perform read-only queries (balances, token info, NFT details, etc.) using natural language (requires OpenAI).
    {
      "name": "execute_query",
      "arguments": { "request": "Get my account balance and token list" }
    }
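Under the hood, each of these tool calls is a standard JSON-RPC 2.0 `tools/call` request. A minimal TypeScript sketch of building such an envelope (the `id` value is arbitrary and illustrative):

```typescript
// Build a JSON-RPC 2.0 envelope for an MCP tools/call request.
// The tool name and arguments mirror the examples above.
function buildToolCall(
  name: string,
  args: Record<string, unknown>,
  id = "req-1"
): string {
  return JSON.stringify({
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  });
}

// Same payload as the execute_transaction example above.
const body = buildToolCall("execute_transaction", {
  request: "Transfer 10 HBAR to 0.0.123456 with memo 'payment'",
});
```

The resulting string can be POSTed as-is to the JSON-RPC endpoint at http://localhost:3000/.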
    

💳 Credit & Payment System

  • purchase_credits: Initiates a credit purchase, returning an unsigned HBAR transfer transaction.
  • verify_payment: Verifies an HBAR payment transaction and allocates credits to the user's account.
  • check_payment_status: Checks the status of a payment transaction.
  • get_payment_history: Retrieves the payment history for an account.
  • check_credit_balance: Checks the current credit balance for an account.
  • get_credit_history: Retrieves the credit usage history for an account.
  • process_hbar_payment: (Admin) Manually process HBAR payments and allocate credits.

🔐 Authentication Tools (Called via MCP on Port 3000)

  • request_auth_challenge: Requests an authentication challenge for a Hedera account to initiate API key generation.
  • verify_auth_signature: Verifies a signed challenge (provided by request_auth_challenge) and returns an API key.
  • get_api_keys: Views active API keys for the authenticated account.
  • rotate_api_key: Creates a new API key and revokes the old one for the authenticated account.
  • revoke_api_key: Permanently disables a specific API key for the authenticated account.

⚙️ Configuration & Management

  • get_pricing_configuration: View current operation costs and pricing tiers.

(Note: The get_server_info tool provides the most up-to-date list of tools and their schemas.)

📖 Usage Examples

With Claude Desktop (MCP Client using stdio)

Ensure MCP_TRANSPORT=stdio in your .env when configuring for Claude Desktop. Add to your Claude Desktop config (claude_desktop_config.json):

{
  "mcpServers": {
    "hedera": {
      "command": "node",
      "args": ["/path/to/hedera-mcp-server/dist/index.js"],
      "env": {
        "HEDERA_OPERATOR_ID": "0.0.YOUR_ACCOUNT",
        "HEDERA_OPERATOR_KEY": "your-private-key",
        "SERVER_ACCOUNT_ID": "0.0.YOUR_SERVER_ACCOUNT",
        "SERVER_PRIVATE_KEY": "your_server_private_key",
        "OPENAI_API_KEY": "sk-your-key",
        "MCP_TRANSPORT": "stdio",
        "REQUIRE_AUTH": "false"
      }
    }
  }
}

Adjust the args path to your install location. Note that claude_desktop_config.json does not allow comments, so keep in mind: the OpenAI key is required, MCP_TRANSPORT must be stdio for Claude Desktop, and REQUIRE_AUTH=false eases local development.

Then in Claude Desktop:

You: "Check my Hedera account balance on testnet using account 0.0.YOUR_ACCOUNT"
Claude: [Uses health_check and execute_query tools] "Your account 0.0.YOUR_ACCOUNT has X.Y HBAR"

You: "Transfer 5 HBAR to 0.0.789012 with memo 'coffee payment' from 0.0.YOUR_ACCOUNT"
Claude: [Uses execute_transaction] "Transaction successful! TX ID: 0.0.YOUR_ACCOUNT@timestamp"

Via MCP Protocol (HTTP JSON-RPC with Authentication)

API keys are obtained by calling MCP tools on the main server (port 3000).

First, get an API key:

# 1. Request challenge (using MCP tool via JSON-RPC POST to root path on port 3000)
curl -X POST http://localhost:3000/ \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": "req-challenge-1",
    "method": "tools/call",
    "params": {
      "name": "request_auth_challenge",
      "arguments": {
        "hederaAccountId": "0.0.123456"
      }
    }
  }'
# This will return a JSON object with `result` containing a JSON string. Parse `result` to get:
# { "challengeId": "...", "challenge": "...", "expiresAt": "..." }

# 2. Sign the `challenge` string from the response with your Hedera private key.

# 3. Verify signature and get API key (using MCP tool via JSON-RPC POST to root path on port 3000)
curl -X POST http://localhost:3000/ \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": "req-verify-1",
    "method": "tools/call",
    "params": {
      "name": "verify_auth_signature",
      "arguments": {
        "challengeId": "the-challenge-id-from-step-1-response",
        "signature": "your-hex-encoded-signature-of-the-challenge"
      }
    }
  }'
# The `result` in the response (a JSON string) will contain your API key if successful:
# { "apiKey": "...", "accountId": "...", ... }
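As the comments above note, the `result` field in these responses is itself a JSON string that must be parsed a second time. A TypeScript sketch of that double parse (the sample payload and key value are illustrative, shaped like the curl example's output):

```typescript
// The server returns tool output as a JSON string inside `result`, so the
// response body is parsed twice: once for the JSON-RPC envelope, once for
// the tool's own payload.
interface VerifyAuthResult {
  apiKey: string;
  accountId: string;
}

function extractToolResult<T>(responseBody: string): T {
  const envelope = JSON.parse(responseBody) as { result: string };
  return JSON.parse(envelope.result) as T;
}

// Illustrative response body; the real apiKey format may differ.
const sample = JSON.stringify({
  jsonrpc: "2.0",
  id: "req-verify-1",
  result: JSON.stringify({ apiKey: "hmcp_abc123", accountId: "0.0.123456" }),
});
const { apiKey } = extractToolResult<VerifyAuthResult>(sample);
```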

Then use the API key for subsequent MCP requests to the main server (port 3000):

# Execute transaction (example using JSON-RPC POST to root path)
curl -X POST http://localhost:3000/ \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-api-key-from-previous-step" \
  -d '{
    "jsonrpc": "2.0",
    "id": "req-mcp-tx-1",
    "method": "tools/call",
    "params": {
      "name": "execute_transaction",
      "arguments": {
        "request": "Transfer 1 HBAR to 0.0.123456",
        "accountId": "0.0.123456"
      }
    }
  }'
# accountId is the account making the request; it should match the API key's
# account or be otherwise authorized.

With Admin Portal

Ensure the server is running with npm run dev:full.

Visit:

  • Admin Portal: http://localhost:3001 (web interface for credit management, user lookup, etc. Check ADMIN_PORTAL_URL in .env)
  • MCP Server (SSE): http://localhost:3000/stream
  • MCP Server (JSON-RPC via POST): http://localhost:3000/
  • Metrics API: http://localhost:3003/metrics (Prometheus metrics if ENABLE_METRICS=true. Check AUTH_API_PORT in .env)
  • MCP Inspector: URL and port for the MCP inspector can be configured via NEXT_PUBLIC_MCP_INSPECTOR_URL and MCP_INSPECTOR_PORT in .env. Default is often http://127.0.0.1:6274 if enabled.

🔐 Authentication & Credits

API Key Authentication

When REQUIRE_AUTH=true (default), an API key is needed for most MCP tool calls to the server on port 3000.

  1. Request Challenge: Use the request_auth_challenge MCP tool (via port 3000) with your Hedera Account ID.
  2. Sign Challenge: Sign the received challenge string using the private key associated with your Hedera Account ID.
  3. Verify & Get API Key: Use the verify_auth_signature MCP tool (via port 3000) with the challengeId and your hex-encoded signature. If successful, an API key is returned.
  4. Use API Key: Include the API key in the Authorization: Bearer your-api-key header for MCP requests to the main server (port 3000).
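Step 2 of this flow (signing the challenge) can be sketched in TypeScript with Node's built-in Ed25519 support. In practice you would sign with your actual Hedera account key (e.g. via @hashgraph/sdk, which also supports ECDSA keys); a throwaway key pair is generated here so the sketch is self-contained:

```typescript
import { generateKeyPairSync, sign } from "node:crypto";

// Throwaway Ed25519 key pair standing in for your Hedera account key.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");

// Sign the challenge string and hex-encode the signature, as expected
// by verify_auth_signature.
function signChallenge(challenge: string): string {
  // Ed25519 in node:crypto takes null for the digest algorithm.
  return sign(null, Buffer.from(challenge, "utf8"), privateKey).toString("hex");
}

const signatureHex = signChallenge("example-challenge-string");
```

The hex string produced here is what goes into the `signature` argument of `verify_auth_signature`.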

The API_KEY_ENCRYPTION_KEY in your .env file is critical for production. It encrypts API keys stored in the database. Set this to a strong, unique secret.

Credit System

Operations consume credits, purchased with HBAR. Natural language processing tools (like execute_transaction, execute_query) require OpenAI and will have associated credit costs.

| Operation Category | Cost Range | Examples |
| --- | --- | --- |
| Free Operations | 0 credits | health_check, get_server_info, check_credit_balance (for self) |
| Auth Operations | 0-5 credits | request_auth_challenge, verify_auth_signature |
| Basic Operations | 2-10 credits | refresh_profile, get_api_keys, get_payment_history |
| Standard Queries | 10-50 credits | execute_query (cost varies by complexity, requires OpenAI) |
| Transactions | 50+ credits | execute_transaction, schedule_transaction (cost varies, requires OpenAI) |

Dynamic Pricing: The base rate is CREDITS_CONVERSION_RATE (e.g., 1000) credits per 1 USD worth of HBAR; the HBAR price of credits therefore fluctuates with the HBAR/USD exchange rate. Per-operation costs are defined in src/db/seed-data/default-pricing-tiers.ts.
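The USD-pegged conversion described above can be sketched as a small TypeScript function (the exchange rate and conversion rate below are example values, not live figures):

```typescript
// Credits are pegged to USD, so the credits received per HBAR move with
// the HBAR/USD exchange rate.
const CREDITS_CONVERSION_RATE = 1000; // base credits per 1 USD, per the docs

function creditsForHbar(amountHbar: number, hbarUsdRate: number): number {
  const usdValue = amountHbar * hbarUsdRate;
  // Flooring to whole credits is an assumption; the server's rounding
  // policy may differ.
  return Math.floor(usdValue * CREDITS_CONVERSION_RATE);
}

// Example: 10 HBAR at an assumed $0.05/HBAR is $0.50 worth of HBAR.
const credits = creditsForHbar(10, 0.05);
```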

Purchase credits:

  1. Call purchase_credits tool: Provide payerAccountId, amountInHbar, and optionally beneficiaryAccountId and memo. It returns unsignedTransactionBytes for an HBAR transfer.
    {
      "name": "purchase_credits",
      "arguments": {
        "payerAccountId": "0.0.YOUR_PAYING_ACCOUNT",
        "amountInHbar": 10.0,
        "beneficiaryAccountId": "0.0.YOUR_BENEFICIARY_ACCOUNT"
      }
    }
    
  2. Sign and submit this transaction to the Hedera network.
  3. Call verify_payment tool: Provide the transactionId of your HBAR payment. Credits are then allocated.
    {
      "name": "verify_payment",
      "arguments": {
        "transactionId": "0.0.YOUR_PAYING_ACCOUNT@timestamp.nanoseconds"
      }
    }
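The transactionId passed to verify_payment uses Hedera's standard format, `<payerAccountId>@<seconds>.<nanos>`. A small TypeScript parser sketch for that format:

```typescript
// Parse a Hedera transaction ID of the form 0.0.X@seconds.nanos.
interface ParsedTransactionId {
  accountId: string;
  seconds: number;
  nanos: number;
}

function parseTransactionId(txId: string): ParsedTransactionId {
  const match = /^(\d+\.\d+\.\d+)@(\d+)\.(\d+)$/.exec(txId);
  if (!match) throw new Error(`Invalid transaction ID: ${txId}`);
  return {
    accountId: match[1],
    seconds: Number(match[2]),
    nanos: Number(match[3]),
  };
}

const parsed = parseTransactionId("0.0.123456@1700000000.000000001");
```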
    

🗄️ Database Support

SQLite (Default - Perfect for Development)

Used by default with npm run dev or npm run dev:full if DATABASE_URL points to a SQLite file.

# .env setting for SQLite:
DATABASE_URL=sqlite://./data/credits.db
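The server chooses its database backend from the DATABASE_URL scheme. A sketch of how that detection could look in TypeScript (the server's actual detection logic may differ):

```typescript
// Infer the database dialect from the DATABASE_URL connection string.
type Dialect = "sqlite" | "postgresql";

function detectDialect(databaseUrl: string): Dialect {
  if (databaseUrl.startsWith("sqlite:")) return "sqlite";
  if (
    databaseUrl.startsWith("postgresql:") ||
    databaseUrl.startsWith("postgres:")
  ) {
    return "postgresql";
  }
  throw new Error(`Unsupported DATABASE_URL scheme: ${databaseUrl}`);
}

const dialect = detectDialect("sqlite://./data/credits.db");
```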

PostgreSQL (For Production or Team Development)

# Option 1: Docker (using provided docker-compose files like docker-compose.postgres.yml or part of dev:full)
# docker-compose.yml might include a PostgreSQL service.
# Example command if using a dedicated postgres compose file:
# docker compose -f docker-compose.postgres.yml up -d

# Option 2: Local or Hosted PostgreSQL
# Update .env to point to your PostgreSQL instance:
# DATABASE_URL=postgresql://user:pass@localhost:5432/hedera_mcp_db
# Then run the server (e.g., `npm run dev:pg` or `npm run dev:full` which might use this .env config)

# npm run dev:pg (script to specifically target PostgreSQL if available)

Database Commands

npm run db:setup      # Initialize database (creates tables based on schema, seeds data)
npm run db:seed       # Reload seed data (pricing tiers, etc.)
npm run db:clear      # Clear all data from the database (use with caution!)
npm run db:push       # (Drizzle Kit) Push schema changes from src/db/schema.ts to the database
npm run db:studio     # (Drizzle Kit) Open Drizzle Studio (web GUI for database management)
npm run migrate       # Run pending migrations based on files in drizzle migration folder
npm run migrate:generate # Generate new migration files based on schema changes

🧪 Testing

Run Tests

npm test                    # Run all Jest tests (unit and some integration)
npm run test:integration    # Run integration tests (requires DB and Hedera connection)
npm run test:auth           # Run authentication specific tests
npm run test:coverage       # Run tests and generate a coverage report

Test Requirements

For integration tests, ensure .env is configured with valid Hedera testnet credentials and OPENAI_API_KEY:

# In your .env file or as environment variables:
HEDERA_OPERATOR_ID=0.0.YOUR_TESTNET_ACCOUNT
HEDERA_OPERATOR_KEY=YOUR_TESTNET_PRIVATE_KEY
SERVER_ACCOUNT_ID=0.0.YOUR_SERVER_TESTNET_ACCOUNT
SERVER_PRIVATE_KEY=YOUR_SERVER_TESTNET_PRIVATE_KEY
OPENAI_API_KEY=sk-YOUR_VALID_OPENAI_KEY

Test Specific Features

(Refer to package.json scripts section for the most up-to-date list of specific test commands like test:credits, test:mcp-e2e, etc.)

# Example: Test credit system (integration)
npm run test:credits

# Enable debug output for tests:
DEBUG=* npm test

🐳 Docker Support

The Docker development environment provides the best developer experience with automatic database setup, optimized dependencies, and hot reload:

# Start full development stack (recommended)
npm run docker:dev:full

# This single command:
# ✅ Builds optimized Docker images with pnpm
# ✅ Automatically initializes the database
# ✅ Starts MCP server with hot reload
# ✅ Starts Admin portal with hot reload
# ✅ Mounts your local code for instant updates
# ✅ Sets up proper networking between services

# View logs
npm run docker:logs

# Access container shell for debugging
npm run docker:shell

# Stop everything
npm run docker:down

Production Deployment

# Configure for production
cp env.example .env.production
# Edit .env.production with production values

# Deploy with Docker Compose
npm run docker:prod

# Or with optional services
docker compose -f docker-compose.prod.yml --profile with-postgres up -d
docker compose -f docker-compose.prod.yml --profile with-redis up -d

# View production logs
npm run docker:prod:logs

Docker Features

  • Multi-stage builds: Separate development and production images
  • pnpm optimization: Reduces image size from 15GB to ~8GB
  • Automatic database setup: No manual initialization needed
  • Health checks: Built-in container health monitoring
  • PM2 for production: Process management with auto-restart
  • Volume management: Persistent data with proper permissions

⚡ Development Workflows

Choose Your Setup

npm run docker:dev:full   # Everything starts automatically

💻 Local Development (More Control)

pnpm install              # Fast install with deduplication
pnpm run dev:full         # Auto-initializes database and starts all services

🎯 Minimal Setup (MCP Server Only)

pnpm run dev              # Just the MCP server with SQLite

What Gets Started

The dev:full command (both Docker and local) starts:

  • MCP Server (Port 3000) - Handles all Hedera operations
  • Admin Portal (Port 3001) - Web UI for managing credits and users
  • Metrics API (Port 3003) - Prometheus metrics endpoint
  • Database - Auto-initialized SQLite or PostgreSQL
  • File Watchers - Hot reload on code changes

Developer Experience Features

🚀 Automatic Database Setup

No more manual database initialization! The development scripts automatically:

  • Create the database if it doesn't exist
  • Run all migrations
  • Seed initial pricing data
  • Set up proper permissions

📦 Optimized Dependencies with pnpm

  • Reduced node_modules from 15GB to ~8GB (50% reduction)
  • Faster installs with aggressive deduplication
  • Workspace support for monorepo structure
  • Shared dependencies between server and admin portal

🔄 Hot Reload Everything

  • Server code changes reload instantly
  • Admin portal updates without restart
  • Database schema changes apply on save
  • No need to restart containers

🛠️ Useful Development Commands

# Database Management
pnpm run db:studio        # Visual database browser
pnpm run db:seed          # Reload pricing data
pnpm run db:clear         # Clear all data (careful!)

# Code Quality
pnpm run typecheck        # Check TypeScript types
pnpm run lint             # Run ESLint
pnpm run format           # Format with Prettier

# Testing
pnpm test                 # Run all tests
pnpm run test:watch       # Run tests in watch mode
pnpm run test:integration # Integration tests only

# Docker Helpers
npm run docker:logs       # View container logs
npm run docker:shell      # Access container shell
npm run docker:clean      # Remove everything and start fresh

🔧 Configuration Reference

Key Environment Variables (from env.example)

| Variable | Default (env.example) | Description |
| --- | --- | --- |
| DATABASE_URL | postgresql://mcpuser:mcppass@localhost:5432/mcpserver | Database connection string (supports sqlite: or postgresql:) |
| HEDERA_NETWORK | testnet | Hedera network (testnet or mainnet) |
| HEDERA_OPERATOR_ID | 0.0.123456 | Your primary Hedera account ID (for operations and fee payments if not overridden) |
| HEDERA_OPERATOR_KEY | your_operator_private_key_here | Private key for HEDERA_OPERATOR_ID |
| SERVER_ACCOUNT_ID | 0.0.789012 | Server's Hedera account for receiving payments and HCS-10 identity; must be funded |
| SERVER_PRIVATE_KEY | your_server_account_private_key_here | Private key for SERVER_ACCOUNT_ID |
| OPENAI_API_KEY | (must be set) | Required. OpenAI API key for natural language processing features |
| ENABLE_HCS10 | true | Enable HCS-10 agent registration and functionality |
| AGENT_NAME | Hedera MCP Server | Name for the HCS-10 agent profile |
| MCP_TRANSPORT | both | MCP transport mode (http, sse, stdio, both); both enables HTTP and SSE for the main server |
| PORT | 3000 | Main server port for MCP (HTTP at / and SSE at /stream) |
| REQUIRE_AUTH | true | Enable API key authentication for MCP tools |
| API_KEY_ENCRYPTION_KEY | (must be set for prod) | CRITICAL for production: secret key for encrypting stored API keys; set a strong random value |
| AUTH_API_PORT | 3003 | Port for the separate metrics-only API (/metrics) |
| CREDITS_CONVERSION_RATE | 1000 | Base credits per 1 USD equivalent of HBAR (used for dynamic pricing) |
| LOG_LEVEL | info | Logging level (debug, info, warn, error) |
| ADMIN_PORTAL_URL | http://localhost:3001 | URL for the admin portal |

Refer to env.example for a comprehensive list of all available environment variables and their descriptions.

Database URLs

# SQLite (development, testing)
DATABASE_URL=sqlite://./data/credits.db
DATABASE_URL=sqlite://:memory: # In-memory, for some tests

# PostgreSQL (production, staging)
# Example for default docker-compose PostgreSQL service named 'db':
DATABASE_URL=postgresql://mcpuser:mcppass@db:5432/mcpserver
# For external PostgreSQL:
# DATABASE_URL=postgresql://user:password@your-postgres-host:port/your-database-name

Authentication Configuration

  • REQUIRE_AUTH=true: Enables API key authentication for MCP tool calls on the main server (port 3000).
  • API_KEY_ENCRYPTION_KEY: Must be set to a strong secret in production. Used to encrypt API keys at rest.
  • Authentication flow (challenge, signature verification, key issuance) is handled via MCP tools like request_auth_challenge and verify_auth_signature on port 3000.
  • The server on AUTH_API_PORT (default 3003) is only for Prometheus metrics at the /metrics endpoint.

Redis integration is optional and provides rate limiting and anomaly detection (see REDIS_URL in env.example).

🚨 Troubleshooting

Quick Fixes

Database Issues

# Database not initializing? Run setup manually:
pnpm run db:setup

# Schema out of sync? Push changes:
pnpm run db:push

Port Conflicts

# Change ports in .env:
PORT=3005              # MCP Server
ADMIN_PORT=3002        # Admin Portal
AUTH_API_PORT=3004     # Metrics API

Docker Issues

# Complete reset:
npm run docker:clean
npm run docker:dev:full

# Can't access services?
docker compose -f docker-compose.dev.yml ps  # Check status
npm run docker:logs                          # View logs

Dependency Issues

# Clear pnpm cache and reinstall:
pnpm store prune
rm -rf node_modules pnpm-lock.yaml
pnpm install

Common Error Messages

| Error | Solution |
| --- | --- |
| "Database not found" | Run pnpm run db:setup |
| "Port already in use" | Change the port in .env or kill the conflicting process |
| "Invalid private key" | Check the Hedera key format in .env |
| "INSUFFICIENT_ACCOUNT_BALANCE" | Fund the account at the testnet portal |
| "OPENAI_API_KEY required" | Add your OpenAI key to .env |
| "Authentication required" | Set REQUIRE_AUTH=false for dev |

Debug Mode

# Enable debug logging
LOG_LEVEL=debug npm run dev:full

# Debug specific components
DEBUG=fastmcp* npm run dev:full
DEBUG=drizzle* npm run dev:full

Complete Reset

# Nuclear option - removes everything:
npm run docker:clean      # If using Docker
rm -rf node_modules dist data/*.db
pnpm install
pnpm run dev:full

📚 API Reference

MCP Tools Reference

Tools are called via an MCP client using JSON-RPC (POST to http://localhost:3000/) or SSE (connected to http://localhost:3000/stream). Use the get_server_info tool to get a live list of tools and their Zod schemas for parameters.

Examples of key tool schemas (parameters go in the `arguments` object):

execute_transaction (Natural Language to Transaction - Requires OpenAI)

{
  "name": "execute_transaction",
  "arguments": {
    "request": "Transfer 5 HBAR from my account to 0.0.123456 and set memo 'payment for goods'",
    "accountId": "0.0.YOUR_ACTING_ACCOUNT" // Optional: Account context for the transaction
  }
}

execute_query (Natural Language to Query - Requires OpenAI)

{
  "name": "execute_query",
  "arguments": {
    "request": "What is the balance of account 0.0.98765?",
    "accountId": "0.0.YOUR_CALLING_ACCOUNT" // Optional: Account context
  }
}

request_auth_challenge (Get challenge for API key generation)

{
  "name": "request_auth_challenge",
  "arguments": {
    "hederaAccountId": "0.0.123456" // Hedera account ID requesting authentication
  }
}

verify_auth_signature (Verify signed challenge, get API key)

{
  "name": "verify_auth_signature",
  "arguments": {
    "challengeId": "the-uuid-challenge-id-received",
    "signature": "hex-encoded-signature-of-the-challenge-string"
  }
}

purchase_credits (Initiate credit purchase)

{
  "name": "purchase_credits",
  "arguments": {
    "payerAccountId": "0.0.PAYER_ACCOUNT",
    "amountInHbar": 10.5,
    "beneficiaryAccountId": "0.0.BENEFICIARY_ACCOUNT", // Optional, defaults to payer
    "memo": "Credits purchase" // Optional
  }
}

check_credit_balance

{
  "name": "check_credit_balance",
  "arguments": {}
}

The accountId argument is optional and defaults to the authenticated API key's account.

HTTP API Endpoints

Main MCP Server (Port: 3000 or as per PORT in .env)

  • JSON-RPC MCP Endpoint
    • URL: / (e.g., http://localhost:3000/)
    • Method: POST
    • Headers: Content-Type: application/json, Authorization: Bearer your-api-key (if REQUIRE_AUTH=true)
    • Body: Standard JSON-RPC 2.0 for tools/call (see MCP client examples).
  • SSE (Server-Sent Events) MCP Endpoint
    • URL: /stream (e.g., http://localhost:3000/stream)
    • Method: GET
    • Authentication: The API key is typically passed as a query parameter (?apiKey=your-api-key) or during the initial handshake, depending on the FastMCP client and server configuration.
  • Health Check
    • URL: /health (e.g., http://localhost:3000/health)
    • Method: GET
    • Response: {"status": "healthy", ...}

Metrics API Server (Port: 3003 or as per AUTH_API_PORT in .env)

  • /metrics: GET. Prometheus-formatted metrics (if ENABLE_METRICS=true).

Note: All authentication operations (requesting challenges, verifying signatures, managing API keys) are handled via MCP tools on the main server (port 3000).

🤝 Contributing

We welcome contributions! Please see the CONTRIBUTING.md file for detailed guidelines on how to contribute to this project, including development standards, testing procedures, and how to submit pull requests.

📄 License

This project is licensed under the Apache License 2.0. See the LICENSE file for details, or view the full license text at http://www.apache.org/licenses/LICENSE-2.0.


Made with ❤️ by Hashgraph Online

Bringing Hedera to AI, one conversation at a time 🚀
