ModelScope MCP Server


A Model Context Protocol (MCP) server that integrates with ModelScope's ecosystem, providing seamless access to AI models, datasets, apps, papers, and generation capabilities through popular MCP clients.

✨ Features

  • 🔐 User Authentication - Retrieve information about the currently authenticated ModelScope user
  • 🎨 AI Image Generation - Generate images from text prompts or transform existing images using AIGC models (supports both text-to-image and image-to-image generation)
  • 🔍 Model Search - Search for machine learning models on ModelScope with advanced filtering options (task type, author, inference support, etc.)
  • 📚 Research Paper Search - Search for arXiv papers indexed in ModelScope with comprehensive metadata
  • 📖 Documentation Search (Coming Soon) - Semantic search for ModelScope documentation and articles
  • 🚀 Gradio API Integration (Coming Soon) - Invoke Gradio APIs exposed by any pre-configured ModelScope studio
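
All of these capabilities are exposed as standard MCP tools, so any MCP client can discover and call them once the server is configured (see Quick Start below). The following is a rough sketch using the official MCP Python SDK over stdio; the tool name and arguments passed to call_tool are hypothetical placeholders rather than the server's documented schema, so list the tools first to see what is actually available:

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server over stdio, forwarding the API token.
    params = StdioServerParameters(
        command="uvx",
        args=["modelscope-mcp-server"],
        env={**os.environ, "MODELSCOPE_API_TOKEN": "your-api-token"},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the real tool names and input schemas.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Hypothetical tool name and arguments, for illustration only.
            result = await session.call_tool("search_models", {"query": "text-to-image"})
            print(result.content)


asyncio.run(main())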

🚀 Quick Start

1. Get Your API Token

  1. Visit ModelScope and sign in to your account
  2. Navigate to [Home] → [Access Tokens] to retrieve your default API token or create a new one

📖 For detailed instructions, refer to the ModelScope Token Documentation

2. Integration with MCP Clients

Add the following JSON configuration to your MCP client's configuration file:

{
  "mcpServers": {
    "modelscope-mcp-server": {
      "command": "uvx",
      "args": ["modelscope-mcp-server"],
      "env": {
        "MODELSCOPE_API_TOKEN": "your-api-token"
      }
    }
  }
}

Or, you can use the pre-built Docker image:

{
  "mcpServers": {
    "modelscope-mcp-server": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "MODELSCOPE_API_TOKEN",
        "spadrian/modelscope-mcp-server:latest"
      ],
      "env": {
        "MODELSCOPE_API_TOKEN": "your-api-token"
      }
    }
  }
}
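
Optionally, pull the image ahead of time to confirm it is available before wiring it into your client:

docker pull spadrian/modelscope-mcp-server:latest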

Refer to the MCP JSON Configuration Standard for more details.

This format is widely adopted across the MCP ecosystem:

  • Cherry Studio: See Cherry Studio MCP Configuration
  • Claude Desktop: Uses claude_desktop_config.json (on macOS: ~/Library/Application Support/Claude/claude_desktop_config.json)
  • Cursor: Uses ~/.cursor/mcp.json
  • VS Code: Uses workspace .vscode/mcp.json (note that its schema differs slightly; see the sketch after this list)
  • Other clients: Many MCP-compatible applications follow this standard
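
For example, VS Code expects a top-level servers key in .vscode/mcp.json rather than mcpServers. The following is a sketch based on VS Code's MCP configuration schema at the time of writing; check your client's documentation for the authoritative format:

{
  "servers": {
    "modelscope-mcp-server": {
      "type": "stdio",
      "command": "uvx",
      "args": ["modelscope-mcp-server"],
      "env": {
        "MODELSCOPE_API_TOKEN": "your-api-token"
      }
    }
  }
}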

🛠️ Development

Environment Setup

  1. Clone and Setup:

    git clone https://github.com/modelscope/modelscope-mcp-server.git
    cd modelscope-mcp-server
    uv sync
    
  2. Activate Environment:

    source .venv/bin/activate  # Linux/macOS
    # or via your IDE
    
  3. Set Your API Token Environment Variable:

    export MODELSCOPE_API_TOKEN="your-api-token"
    

    Or, set the API token in a .env file at the project root for convenience:

    MODELSCOPE_API_TOKEN="your-api-token"
    

Running the Demo Script

Run a quick demo to explore the server's capabilities:

uv run python demo.py

Use the --full flag to demonstrate all available features:

uv run python demo.py --full

Running the Server Locally

# Standard stdio transport (default)
uv run modelscope-mcp-server

# Streamable HTTP transport for web integration
uv run modelscope-mcp-server --transport http

# HTTP/SSE transport with custom port (default: 8000)
uv run modelscope-mcp-server --transport [http/sse] --port 8080

For HTTP/SSE mode, connect using a local URL in your MCP client configuration:

{
  "mcpServers": {
    "modelscope-mcp-server": {
      "url": "http://127.0.0.1:8000/mcp/"
    }
  }
}
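
To sanity-check the HTTP endpoint outside of an MCP client, a short script can list the server's tools. This is a minimal sketch that assumes the official MCP Python SDK's streamable HTTP client, which is not part of this project:

import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # Connect to a server started with `--transport http` on the default port.
    async with streamablehttp_client("http://127.0.0.1:8000/mcp/") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())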

You can also debug the server using the MCP Inspector tool:

npx @modelcontextprotocol/inspector uv run modelscope-mcp-server

The above command uses stdio transport by default; you can switch to HTTP or SSE in the Web UI as needed.

Testing

Run the complete test suite:

# Basic test run
uv run pytest

# Run tests for a specific file
uv run pytest tests/test_search_papers.py

# With coverage report
uv run pytest --cov=src --cov=examples --cov-report=html

Code Quality

This project uses pre-commit hooks for automated code formatting, linting, and type checking:

# Install hooks
uv run pre-commit install

# Run all checks manually
uv run pre-commit run --all-files

All PRs must pass these checks and include appropriate tests.

📦 Release Management

TODO: trigger release from GitHub Actions

Release to PyPI

python scripts/pypi_release.py

Release to Docker Hub

docker login

# Release to Docker Hub (will auto-detect buildx or use traditional build)
python scripts/docker_release.py

# Release to Docker Hub (use traditional multi-arch build with manifest)
python scripts/docker_release.py --traditional-multiarch

🤝 Contributing

We welcome contributions! Please ensure that:

  1. All PRs include relevant tests and pass the full test suite
  2. Code follows our style guidelines (enforced by pre-commit hooks)
  3. Documentation is updated for new features
  4. Commit messages follow the Conventional Commits format (see the example below)
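
For reference, Conventional Commits style messages look like this (scopes and descriptions are illustrative only):

feat(search): add author filter to model search
fix(image): handle empty prompts gracefully
docs: clarify HTTP transport setup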

📜 License

This project is licensed under the Apache License (Version 2.0).
