
CocoIndex Code

Created by cocoindex-io, a month ago.
Overview

CocoIndex Code

An MCP (Model Context Protocol) server for indexing and querying codebases using CocoIndex.

Features

  • Semantic Code Search: Find relevant code using natural language queries
  • Incremental Indexing: Only re-indexes changed files for fast updates
  • Multi-Language Support: Python, JavaScript/TypeScript, Rust, Go, Java, C/C++, C#, SQL, Shell
  • Flexible Embeddings: Local SentenceTransformers (default) or 100+ cloud providers via LiteLLM
  • SQLite Storage: Portable, no external database required

Prerequisites

Install uv (which provides uvx):

curl -LsSf https://astral.sh/uv/install.sh | sh
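Once the installer finishes, a quick check confirms that uvx is on your PATH (the installer places binaries in ~/.local/bin by default):

```shell
# Verify uv's uvx launcher is available after installation
command -v uvx >/dev/null 2>&1 \
  && uvx --version \
  || echo "uvx not found; re-run the installer or add ~/.local/bin to PATH"
```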

Usage with Claude Code

No installation needed — uvx runs it directly.

Default (Local Embeddings)

Uses a local SentenceTransformers model (sentence-transformers/all-MiniLM-L6-v2). No API key required:

claude mcp add cocoindex-code \
  -- uvx --prerelease=explicit --with "cocoindex>=1.0.0a13" cocoindex-code@latest

With a Cloud or Custom Embedding Model

Set COCOINDEX_CODE_EMBEDDING_MODEL to any LiteLLM-supported model, along with the provider's API key:

Ollama (Local)
claude mcp add cocoindex-code \
  -e COCOINDEX_CODE_EMBEDDING_MODEL=ollama/nomic-embed-text \
  -- uvx --prerelease=explicit --with "cocoindex>=1.0.0a13" cocoindex-code@latest

Set OLLAMA_API_BASE if your Ollama server is not at http://localhost:11434.

OpenAI
claude mcp add cocoindex-code \
  -e COCOINDEX_CODE_EMBEDDING_MODEL=text-embedding-3-small \
  -e OPENAI_API_KEY=your-api-key \
  -- uvx --prerelease=explicit --with "cocoindex>=1.0.0a13" cocoindex-code@latest
Azure OpenAI
claude mcp add cocoindex-code \
  -e COCOINDEX_CODE_EMBEDDING_MODEL=azure/your-deployment-name \
  -e AZURE_API_KEY=your-api-key \
  -e AZURE_API_BASE=https://your-resource.openai.azure.com \
  -e AZURE_API_VERSION=2024-06-01 \
  -- uvx --prerelease=explicit --with "cocoindex>=1.0.0a13" cocoindex-code@latest
Gemini
claude mcp add cocoindex-code \
  -e COCOINDEX_CODE_EMBEDDING_MODEL=gemini/text-embedding-004 \
  -e GEMINI_API_KEY=your-api-key \
  -- uvx --prerelease=explicit --with "cocoindex>=1.0.0a13" cocoindex-code@latest
Mistral
claude mcp add cocoindex-code \
  -e COCOINDEX_CODE_EMBEDDING_MODEL=mistral/mistral-embed \
  -e MISTRAL_API_KEY=your-api-key \
  -- uvx --prerelease=explicit --with "cocoindex>=1.0.0a13" cocoindex-code@latest
Voyage (Code-Optimized)
claude mcp add cocoindex-code \
  -e COCOINDEX_CODE_EMBEDDING_MODEL=voyage/voyage-code-3 \
  -e VOYAGE_API_KEY=your-api-key \
  -- uvx --prerelease=explicit --with "cocoindex>=1.0.0a13" cocoindex-code@latest
Cohere
claude mcp add cocoindex-code \
  -e COCOINDEX_CODE_EMBEDDING_MODEL=cohere/embed-english-v3.0 \
  -e COHERE_API_KEY=your-api-key \
  -- uvx --prerelease=explicit --with "cocoindex>=1.0.0a13" cocoindex-code@latest
AWS Bedrock
claude mcp add cocoindex-code \
  -e COCOINDEX_CODE_EMBEDDING_MODEL=bedrock/amazon.titan-embed-text-v2:0 \
  -e AWS_ACCESS_KEY_ID=your-access-key \
  -e AWS_SECRET_ACCESS_KEY=your-secret-key \
  -e AWS_REGION_NAME=us-east-1 \
  -- uvx --prerelease=explicit --with "cocoindex>=1.0.0a13" cocoindex-code@latest
Nebius
claude mcp add cocoindex-code \
  -e COCOINDEX_CODE_EMBEDDING_MODEL=nebius/BAAI/bge-en-icl \
  -e NEBIUS_API_KEY=your-api-key \
  -- uvx --prerelease=explicit --with "cocoindex>=1.0.0a13" cocoindex-code@latest

Any model supported by LiteLLM works — see the full list of embedding providers.

Setup .gitignore

Add the index directory to your .gitignore so it isn't committed:

echo .cocoindex_code >> .gitignore
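A guarded variant avoids appending a duplicate entry when the line is already present:

```shell
# Add .cocoindex_code to .gitignore only if it is not already listed
grep -qxF '.cocoindex_code' .gitignore 2>/dev/null \
  || echo '.cocoindex_code' >> .gitignore
```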

Uninstall

Remove the MCP server and delete the local index:

claude mcp remove cocoindex-code
rm -rf .cocoindex_code

Configuration

  • COCOINDEX_CODE_ROOT_PATH: Root path of the codebase. Default: auto-discovered (see below).
  • COCOINDEX_CODE_EMBEDDING_MODEL: Embedding model (see below). Default: sbert/sentence-transformers/all-MiniLM-L6-v2.

Embedding Model

The COCOINDEX_CODE_EMBEDDING_MODEL variable uses a prefix to select the embedding backend:

  • sbert/ prefix — uses SentenceTransformers (runs locally, no API key needed). Example: sbert/sentence-transformers/all-MiniLM-L6-v2
  • Otherwise — uses LiteLLM (supports 100+ providers). Example: text-embedding-3-small
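The selection rule can be sketched in shell (an illustration of the prefix convention above, not the server's actual dispatch code):

```shell
# Pick the embedding backend from the model string's prefix
# (illustrative sketch; falls back to the documented default model)
MODEL="${COCOINDEX_CODE_EMBEDDING_MODEL:-sbert/sentence-transformers/all-MiniLM-L6-v2}"
case "$MODEL" in
  sbert/*) BACKEND="SentenceTransformers (local): ${MODEL#sbert/}" ;;
  *)       BACKEND="LiteLLM: $MODEL" ;;
esac
echo "$BACKEND"
```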

Root Path Discovery

If COCOINDEX_CODE_ROOT_PATH is not set, the codebase root is discovered by:

  1. Finding the nearest parent directory containing .cocoindex_code/
  2. Finding the nearest parent directory containing .git/
  3. Falling back to the current working directory
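The walk above can be sketched as a shell function (an illustration of the discovery order, not the server's implementation):

```shell
# Walk up from $PWD looking for each marker in priority order;
# fall back to the current working directory if none is found.
find_root() {
  for marker in .cocoindex_code .git; do
    dir="$PWD"
    while [ "$dir" != "/" ]; do
      if [ -d "$dir/$marker" ]; then
        echo "$dir"
        return
      fi
      dir=$(dirname "$dir")
    done
  done
  echo "$PWD"   # fallback: current working directory
}

echo "Discovered root: $(find_root)"
```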

MCP Tools

search

Search the codebase using semantic similarity.

search(
    query: str,               # Natural language query or code snippet
    limit: int = 10,          # Maximum results (1-100)
    offset: int = 0,          # Pagination offset
    refresh_index: bool = True  # Refresh index before querying
)

The refresh_index parameter controls whether the index is refreshed before searching:

  • True (default): Refreshes the index to include any recent changes
  • False: Skips the refresh for faster consecutive queries
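Over MCP, a client invokes this as a tools/call request. A request body might look like the following (the JSON-RPC envelope follows the MCP specification; the query string is only an example):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": {
      "query": "where is the config file parsed?",
      "limit": 5,
      "refresh_index": false
    }
  }
}
```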

Returns matching code chunks with:

  • File path
  • Language
  • Code content
  • Line numbers (start/end)
  • Similarity score

Index Storage

The index is stored in .cocoindex_code/ under your codebase root:

your-project/
├── .cocoindex_code/
│   ├── target_sqlite.db  # Vector index (SQLite + sqlite-vec)
│   └── cocoindex.db/     # CocoIndex state
├── src/
│   └── ...

Add .cocoindex_code/ to your .gitignore.

Supported File Types

  • Python: .py, .pyi
  • JavaScript: .js, .jsx, .mjs, .cjs
  • TypeScript: .ts, .tsx
  • Rust: .rs
  • Go: .go
  • Java: .java
  • C: .c, .h
  • C++: .cpp, .hpp, .cc, .cxx, .hxx, .hh
  • C#: .cs
  • SQL: .sql
  • Shell: .sh, .bash, .zsh
  • Markdown: .md, .mdx
  • Plain Text: .txt, .rst

Common generated directories are automatically excluded:

  • __pycache__/
  • node_modules/
  • target/
  • dist/
  • vendor/ (Go vendored dependencies, matched by domain-based child paths)

Development

Local Testing with Claude Code

To test locally without installing the package, use the Claude Code CLI:

claude mcp add cocoindex-code \
  -- uv run --project /path/to/cocoindex-code cocoindex-code

Or add to .mcp.json in your project root:

{
  "mcpServers": {
    "cocoindex-code": {
      "command": "uv",
      "args": ["run", "--project", "/path/to/cocoindex-code", "cocoindex-code"]
    }
  }
}

Running Tests

# Install dev dependencies
uv sync --group dev

# Run tests
uv run pytest tests/ -v

# Run pre-commit hooks
uv run pre-commit run --all-files

License

Apache-2.0

Server Config

{
  "mcpServers": {
    "cocoindex-code": {
      "command": "uvx",
      "args": [
        "--prerelease=explicit",
        "--with",
        "cocoindex>=1.0.0a13",
        "cocoindex-code@latest"
      ]
    }
  }
}