
DeepSeek MCP Server

A production-grade MCP server integrating with DeepSeek's API, featuring advanced code review capabilities, efficient file management, and API account management.

Features

  • Multi-Model Support: Choose from various DeepSeek models including DeepSeek Chat and DeepSeek Coder
  • Code Review Focus: Built-in system prompt for detailed code analysis with markdown output
  • Automatic File Handling: Built-in file management with direct path integration
  • API Account Management: Check balance and estimate token usage
  • JSON Mode Support: Request structured JSON responses for easy parsing
  • Advanced Error Handling: Graceful degradation with structured error logging
  • Improved Retry Logic: Automatic retries with configurable exponential backoff for API calls
  • Security: Configurable file type restrictions and size limits
  • Performance Monitoring: Built-in metrics collection for request latency and throughput

Prerequisites

  • Go 1.21+
  • DeepSeek API key
  • Basic understanding of the MCP protocol

Installation & Quick Start

# Clone and build
git clone https://github.com/your-username/DeepseekMCP
cd DeepseekMCP
go build -o bin/mcp-deepseek

# Start server with environment variables
export DEEPSEEK_API_KEY=your_api_key
export DEEPSEEK_MODEL=deepseek-chat
./bin/mcp-deepseek

Configuration

Claude Desktop

{
  "mcpServers": {
    "deepseek": {
      "command": "/Your/project/path/bin/mcp-deepseek",
      "env": {
        "DEEPSEEK_API_KEY": "YOUR_API_KEY",
        "DEEPSEEK_MODEL": "deepseek-chat"
      }
    }
  }
}

Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| DEEPSEEK_API_KEY | DeepSeek API key | Required |
| DEEPSEEK_MODEL | Model ID from available models | deepseek-chat |
| DEEPSEEK_SYSTEM_PROMPT | System prompt for code review | Default code review prompt |
| DEEPSEEK_SYSTEM_PROMPT_FILE | Path to file containing system prompt | Empty |
| DEEPSEEK_MAX_FILE_SIZE | Max upload size (bytes) | 10485760 (10MB) |
| DEEPSEEK_ALLOWED_FILE_TYPES | Comma-separated MIME types | [Common text/code types] |
| DEEPSEEK_TIMEOUT | API timeout in seconds | 90 |
| DEEPSEEK_MAX_RETRIES | Max API retries | 2 |
| DEEPSEEK_INITIAL_BACKOFF | Initial backoff time (seconds) | 1 |
| DEEPSEEK_MAX_BACKOFF | Maximum backoff time (seconds) | 10 |
| DEEPSEEK_TEMPERATURE | Model temperature (0.0-1.0) | 0.4 |
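
The three retry variables combine into a standard exponential backoff schedule: wait DEEPSEEK_INITIAL_BACKOFF seconds after the first failure, double the wait after each subsequent failure, and never wait longer than DEEPSEEK_MAX_BACKOFF. The Go sketch below only illustrates that interaction; it is not the server's actual retry code.

// Illustrative sketch of how the retry/backoff settings relate.
// Not the server's actual implementation.
package main

import (
    "errors"
    "fmt"
    "time"
)

// retryWithBackoff makes one initial attempt plus up to maxRetries retries,
// doubling the wait after each failure and capping it at maxBackoff.
func retryWithBackoff(call func() error, maxRetries int, initialBackoff, maxBackoff time.Duration) error {
    backoff := initialBackoff
    var err error
    for attempt := 0; attempt <= maxRetries; attempt++ {
        if err = call(); err == nil {
            return nil
        }
        if attempt == maxRetries {
            break
        }
        time.Sleep(backoff)
        backoff *= 2
        if backoff > maxBackoff {
            backoff = maxBackoff
        }
    }
    return fmt.Errorf("request failed after %d retries: %w", maxRetries, err)
}

func main() {
    // Defaults from the table above: 2 retries, 1s initial backoff, 10s cap.
    err := retryWithBackoff(
        func() error { return errors.New("transient API error") },
        2, 1*time.Second, 10*time.Second)
    fmt.Println(err)
}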

Example .env:

DEEPSEEK_API_KEY=your_api_key
DEEPSEEK_MODEL=deepseek-chat
DEEPSEEK_SYSTEM_PROMPT="Your custom code review prompt here"
# Alternative: load system prompt from file
# DEEPSEEK_SYSTEM_PROMPT_FILE=/path/to/prompt.txt
DEEPSEEK_MAX_FILE_SIZE=5242880  # 5MB
DEEPSEEK_ALLOWED_FILE_TYPES=text/x-go,text/markdown
DEEPSEEK_TEMPERATURE=0.7

Core API Tools

Currently, the server provides the following tools:

deepseek_ask

Used for code analysis, review, and general queries with optional file path inclusion.

{
  "name": "deepseek_ask",
  "arguments": {
    "query": "Review this Go code for concurrency issues...",
    "model": "deepseek-chat",
    "systemPrompt": "Optional custom review instructions",
    "file_paths": ["main.go", "config.go"],
    "json_mode": false
  }
}

deepseek_models

Lists all available DeepSeek models with their capabilities.

{
  "name": "deepseek_models",
  "arguments": {}
}

deepseek_balance

Checks your DeepSeek API account balance and availability status.

{
  "name": "deepseek_balance",
  "arguments": {}
}
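
Under the hood this maps to DeepSeek's account endpoint. At the time of writing the platform documents GET https://api.deepseek.com/user/balance; the Go sketch below assumes that endpoint and a simplified response shape, and is illustrative rather than the server's own code.

// Hedged sketch of a balance query against DeepSeek's documented
// GET /user/balance endpoint; verify field names against the current API reference.
package main

import (
    "encoding/json"
    "fmt"
    "net/http"
    "os"
)

type balanceResponse struct {
    IsAvailable  bool `json:"is_available"`
    BalanceInfos []struct {
        Currency     string `json:"currency"`
        TotalBalance string `json:"total_balance"`
    } `json:"balance_infos"`
}

func main() {
    req, err := http.NewRequest("GET", "https://api.deepseek.com/user/balance", nil)
    if err != nil {
        panic(err)
    }
    req.Header.Set("Authorization", "Bearer "+os.Getenv("DEEPSEEK_API_KEY"))

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()

    var out balanceResponse
    if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
        panic(err)
    }
    for _, b := range out.BalanceInfos {
        fmt.Printf("available=%v balance=%s %s\n", out.IsAvailable, b.TotalBalance, b.Currency)
    }
}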

deepseek_token_estimate

Estimates the token count for text or a file to help with quota management.

{
  "name": "deepseek_token_estimate",
  "arguments": {
    "text": "Your text to estimate...",
    "file_path": "path/to/your/file.go"
  }
}
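
The README does not pin down the estimation method. As a rough mental model, English text and source code average a few characters per token; the sketch below uses an assumed four-characters-per-token heuristic purely for illustration, and the server's own estimator may count differently.

// Hypothetical token estimator using a ~4 characters-per-token rule of thumb.
// The server's actual estimate may differ.
package main

import (
    "fmt"
    "os"
    "unicode/utf8"
)

func estimateTokens(text string) int {
    return (utf8.RuneCountInString(text) + 3) / 4
}

func main() {
    data, err := os.ReadFile("main.go") // any file you want to budget for
    if err != nil {
        panic(err)
    }
    fmt.Printf("~%d tokens\n", estimateTokens(string(data)))
}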

Supported Models

The following DeepSeek models are supported by default:

| Model ID | Description |
|----------|-------------|
| deepseek-chat | General-purpose chat model balancing performance and efficiency |
| deepseek-coder | Specialized model for coding and technical tasks |
| deepseek-reasoner | Model optimized for reasoning and problem-solving tasks |

Note: The models actually available may vary based on your API access level. The server automatically discovers and exposes every model your API key can access through the DeepSeek API.

Supported File Types

| Extension | MIME Type |
|-----------|-----------|
| .go | text/x-go |
| .py | text/x-python |
| .js | text/javascript |
| .md | text/markdown |
| .java | text/x-java |
| .c, .h | text/x-c |
| .cpp, .hpp | text/x-c++ |
| 25+ more | See getMimeTypeFromPath in deepseek.go |
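
Conceptually this is a lookup keyed on the file extension. The Go sketch below mirrors the table above and is only an approximation of the real mapping; the authoritative list is getMimeTypeFromPath in deepseek.go.

// Approximate sketch of an extension-to-MIME lookup. The full, authoritative
// mapping lives in getMimeTypeFromPath in deepseek.go.
package main

import (
    "fmt"
    "path/filepath"
    "strings"
)

var mimeByExt = map[string]string{
    ".go":   "text/x-go",
    ".py":   "text/x-python",
    ".js":   "text/javascript",
    ".md":   "text/markdown",
    ".java": "text/x-java",
    ".c":    "text/x-c",
    ".h":    "text/x-c",
    ".cpp":  "text/x-c++",
    ".hpp":  "text/x-c++",
}

func mimeTypeFromPath(path string) string {
    if mime, ok := mimeByExt[strings.ToLower(filepath.Ext(path))]; ok {
        return mime
    }
    return "text/plain" // assumed fallback for unknown extensions
}

func main() {
    fmt.Println(mimeTypeFromPath("config.go")) // text/x-go
}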

Operational Notes

  • Degraded Mode: Automatically enters safe mode on initialization errors
  • Audit Logging: All operations logged with timestamps and metadata
  • Security: File content validated by MIME type and size before processing

File Handling

The server handles files directly through the deepseek_ask tool:

  1. Specify local file paths in the file_paths array parameter
  2. The server automatically:
    • Reads the files from the provided paths
    • Determines the correct MIME type based on file extension
    • Uploads the file content to the DeepSeek API
    • Uses the files as context for the query

This direct file handling approach eliminates the need for separate file upload/management endpoints.
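
A condensed sketch of that flow is below, assuming the size and file-type checks described under Operational Notes; the real server validates by MIME type (DEEPSEEK_ALLOWED_FILE_TYPES) rather than by the bare extension used here.

// Simplified, illustrative file-handling flow: validate each path, then fold
// its contents into the query context. Not the server's actual implementation.
package main

import (
    "fmt"
    "os"
    "path/filepath"
    "strings"
)

const maxFileSize = 10 * 1024 * 1024 // mirrors the DEEPSEEK_MAX_FILE_SIZE default

var allowedExts = map[string]bool{".go": true, ".md": true}

func buildContext(query string, paths []string) (string, error) {
    var b strings.Builder
    b.WriteString(query)
    for _, p := range paths {
        info, err := os.Stat(p)
        if err != nil {
            return "", err
        }
        if info.Size() > maxFileSize {
            return "", fmt.Errorf("%s exceeds the size limit", p)
        }
        if !allowedExts[strings.ToLower(filepath.Ext(p))] {
            return "", fmt.Errorf("%s has a disallowed file type", p)
        }
        data, err := os.ReadFile(p)
        if err != nil {
            return "", err
        }
        fmt.Fprintf(&b, "\n\n--- %s ---\n%s", p, data)
    }
    return b.String(), nil
}

func main() {
    ctx, err := buildContext("Review this Go code for concurrency issues.", []string{"main.go"})
    if err != nil {
        panic(err)
    }
    fmt.Printf("prepared %d characters of context\n", len(ctx))
}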

JSON Mode Support

For integrations that require structured data output, the server supports JSON mode:

  • Structured Responses: Request properly formatted JSON responses from DeepSeek models
  • Parser-Friendly: Ideal for CI/CD pipelines and automation systems
  • Easy Integration: Simply set json_mode: true in your request

Example with JSON mode:

{
  "name": "deepseek_ask",
  "arguments": {
    "query": "Analyze this code and return a JSON object with: issues_found (array of strings), complexity_score (number 1-10), and recommendations (array of strings)",
    "model": "deepseek-chat",
    "json_mode": true,
    "file_paths": ["main.go", "config.go"]
  }
}

This returns a well-formed JSON response that can be parsed directly by your application.
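
On the consuming side, a response to the request above can be decoded straight into a struct. The field names below simply mirror what that prompt asked for; they are not defined by the server, so adjust them to whatever schema you request.

// Decoding a JSON-mode response whose shape was dictated by the prompt above.
package main

import (
    "encoding/json"
    "fmt"
)

type reviewResult struct {
    IssuesFound     []string `json:"issues_found"`
    ComplexityScore float64  `json:"complexity_score"`
    Recommendations []string `json:"recommendations"`
}

func main() {
    // Stand-in for the model's JSON-mode output.
    raw := `{"issues_found":["unguarded map write"],"complexity_score":6,"recommendations":["guard the map with a sync.Mutex"]}`

    var result reviewResult
    if err := json.Unmarshal([]byte(raw), &result); err != nil {
        panic(err)
    }
    fmt.Printf("score %.0f/10, %d issue(s), %d recommendation(s)\n",
        result.ComplexityScore, len(result.IssuesFound), len(result.Recommendations))
}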

Development

Command-line Options

The server supports these command-line options to override environment variables:

# Override the DeepSeek model to use
./bin/mcp-deepseek -deepseek-model=deepseek-coder

# Override the system prompt
./bin/mcp-deepseek -deepseek-system-prompt="Your custom prompt here"

# Override the temperature setting (0.0-1.0)
./bin/mcp-deepseek -deepseek-temperature=0.8
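
Flag values take precedence over the matching environment variables. A rough sketch of how that precedence could be wired with Go's standard flag package is below; the server's actual option handling may differ.

// Illustrative flag-over-environment precedence using the standard flag package.
package main

import (
    "flag"
    "fmt"
    "os"
)

func main() {
    model := flag.String("deepseek-model", "", "override DEEPSEEK_MODEL")
    flag.Parse()

    // Prefer the flag, fall back to the environment, then to a default.
    resolved := *model
    if resolved == "" {
        resolved = os.Getenv("DEEPSEEK_MODEL")
    }
    if resolved == "" {
        resolved = "deepseek-chat"
    }
    fmt.Println("using model:", resolved)
}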

Running Tests

To run tests:

go test -v ./...

Running Linter

golangci-lint run

Formatting Code

gofmt -w .

License

MIT License

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the project
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request