Created by deadraid

gitingest-mcp


MCP server that converts Git repositories into compact, prompt-ready digests for large language models (LLMs).

gitingest-mcp exposes the Model Context Protocol (MCP) over stdio so any LLM-aware editor or tool can request an up-to-date representation of a Git repository, local or remote. The server clones the repository (or analyses a local checkout), applies flexible filters (e.g. .gitignore rules, size limits, glob patterns), and streams back a single text document containing:

  1. A summary (branch, commit, size, token count…)
  2. A human-readable directory tree
  3. The concatenated contents of every included file

The result is optimised for conversational code understanding, code review, and RAG-style retrieval.
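For illustration, a digest for a tiny repository might look roughly like the sketch below. The section separators, field names, and ordering are illustrative assumptions, not the server's guaranteed output format; only the three-part structure (summary, tree, file contents) is documented above.

```
================================================
Summary
================================================
Repository: deadraid/gitingest-mcp
Branch: main
Files included: 2
Estimated tokens: 1.2k

================================================
Directory tree
================================================
└── ./
    ├── README.md
    └── src/
        └── index.ts

================================================
File: README.md
================================================
# gitingest-mcp
...
```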


✨ Features

• Works with GitHub, GitLab, Bitbucket, generic git, or a local path
• Optional shallow clone, sparse checkout, or submodule support
• Honor .gitignore and custom .gitingestignore files
• Powerful include/exclude glob patterns
• Hard limits for file size, file count, total size, or token budget
• Built-in retry & timeout logic to survive flaky networks
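The retry behaviour can be pictured as a small wrapper around each network operation. The sketch below is an assumption about how maxRetries and retryDelay interact (the actual implementation may differ, e.g. it may use exponential rather than linear backoff); the function name withRetry is hypothetical.

```typescript
// Hypothetical sketch of retry logic driven by maxRetries/retryDelay.
// Not the actual gitingest-mcp implementation.
async function withRetry<T>(
  op: () => Promise<T>,
  maxRetries = 3,
  retryDelay = 1000, // base delay in ms
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await op();
    } catch (err) {
      lastError = err;
      if (attempt === maxRetries) break;
      // Linear backoff: wait (attempt + 1) times the base delay.
      await new Promise((resolve) => setTimeout(resolve, retryDelay * (attempt + 1)));
    }
  }
  throw lastError;
}
```

With maxRetries = 3, an operation is attempted at most four times before the last error is re-thrown.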


🚀 Quick Start

# One-off execution via npx
npx -y gitingest-mcp-server

Every MCP-aware client needs a small configuration snippet. Example:

{
  "mcpServers": {
    "gitingest-mcp": {
      "command": "npx",
      "args": ["-y", "gitingest-mcp-server"]
    }
  }
}

(Swap npx for the absolute path to gitingest-mcp-server if you installed the package globally or are running from source.)


🛠️ Available Tools

ingest_repository

Transform a Git repository into an LLM-friendly digest.

| Parameter | Type | Default | Description |
|---|---|---|---|
| repository | string | required | Git URL (https://…), SSH (git@…), shorthand (user/repo), or local path |
| source | "github" \| "gitlab" \| "bitbucket" \| "local" \| "git" | auto | Force a specific provider |
| branch / commit / tag | string | - | Check out a specific ref |
| cloneDepth | number | 1 | Depth of shallow clone |
| sparseCheckout | boolean | false | Enable sparse checkout when possible |
| includeSubmodules | boolean | false | Recursively pull submodules |
| includeGitignored | boolean | false | Include files matched by .gitignore |
| useGitignore | boolean | true | Respect .gitignore when filtering |
| useGitingestignore | boolean | true | Respect .gitingestignore when filtering |
| excludePatterns / includePatterns | string[] | - | Additional glob patterns |
| maxFileSize | number | - | Max single file size (bytes) |
| maxFiles | number | 1000 | Hard file-count limit |
| maxTotalSize | number | 52428800 (50 MiB) | Max combined size (bytes) |
| maxTokens | number | - | Trim output after N tokens |
| token | string | - | Auth token for private repos |
| maxRetries | number | 3 | Retry attempts for network ops |
| retryDelay | number | 1000 | Base delay between retries (ms) |
| timeout | number | 30000 | Abort the entire operation after N ms |
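In TypeScript terms, the argument object can be described roughly as follows. The field names and defaults are taken from the table above; the interface name and the optionality of each field are assumptions, not the server's published schema.

```typescript
// Rough shape of the ingest_repository arguments, inferred from the
// parameter table above. Illustrative only, not the published schema.
interface IngestRepositoryArgs {
  repository: string;            // URL, SSH, user/repo shorthand, or local path
  source?: "github" | "gitlab" | "bitbucket" | "local" | "git";
  branch?: string;
  commit?: string;
  tag?: string;
  cloneDepth?: number;           // default 1
  sparseCheckout?: boolean;      // default false
  includeSubmodules?: boolean;   // default false
  includeGitignored?: boolean;   // default false
  useGitignore?: boolean;        // default true
  useGitingestignore?: boolean;  // default true
  excludePatterns?: string[];
  includePatterns?: string[];
  maxFileSize?: number;          // bytes
  maxFiles?: number;             // default 1000
  maxTotalSize?: number;         // bytes, default 52428800 (50 MiB)
  maxTokens?: number;
  token?: string;                // auth token for private repos
  maxRetries?: number;           // default 3
  retryDelay?: number;           // ms, default 1000
  timeout?: number;              // ms, default 30000
}

// Only `repository` is required; everything else has a sensible default.
const example: IngestRepositoryArgs = {
  repository: "deadraid/gitingest-mcp",
  branch: "main",
};
```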

Example call body:

{
  "name": "ingest_repository",
  "arguments": {
    "repository": "deadraid/gitingest-mcp",
    "branch": "main",
    "excludePatterns": ["**/tests/**"],
    "maxFileSize": 50000
  }
}
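Under the hood, an MCP client delivers a call body like the one above as a JSON-RPC 2.0 tools/call request written to the server's stdin. A minimal sketch of building that envelope (the id value and the helper function are illustrative):

```typescript
// Build the JSON-RPC 2.0 envelope an MCP client writes to the server's
// stdin for a tools/call request. The id is an arbitrary request identifier.
function buildToolsCall(
  id: number,
  name: string,
  args: Record<string, unknown>,
): string {
  return JSON.stringify({
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  });
}

const message = buildToolsCall(1, "ingest_repository", {
  repository: "deadraid/gitingest-mcp",
  branch: "main",
});
```

In practice your MCP client library (or editor integration) handles this framing for you; the sketch only shows what travels over the wire.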

🔭 Running from Source

# Clone & install
git clone https://github.com/deadraid/gitingest-mcp.git
cd gitingest-mcp
npm install

# Build TypeScript → dist
npm run build

# Start the MCP server (stdout/stdin)
node dist/server/index.js

During development you can use:

npm run dev   # ts-node with autoreload
npm run test  # vitest
npm run lint  # eslint

✅ Tests

Vitest suites live under tests/ and exercise cancellation, timeouts, schema validation, and other edge cases.

Run all tests with coverage:

npm run test:coverage

📝 License

MIT © 2024 RaidHon
