
Ctx


ctx: The missing link between your codebase and your LLM. Context as Code (CaC) tool with MCP server inside.


Good morning, LLM

ctx is a tool made to solve a big problem when chatting with LLMs like ChatGPT or Claude: giving them enough context about your project.

There is an article about Context Generator on Medium that explains the motivation behind the project and the problem it solves.

Instead of manually copying or explaining your entire codebase each time, ctx automatically builds neat, organized context files from (see the sketch below):

  • Code files,
  • GitHub repositories,
  • Git commits and diffs,
  • Web pages (URLs) with CSS selectors,
  • and plain text.
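
A minimal sketch of how several source types can sit side by side in one document. The file source follows the schema shown in the Quick Start below; the url and text fields are illustrative assumptions, so check the JSON schema for the exact key names:

documents:
  - description: "Project overview"
    outputPath: "overview-context.md"
    sources:
      - type: file                 # local code files
        sourcePaths:
          - src
        filePattern: "*.php"
      - type: url                  # web pages; keys are assumptions, see the docs
        urls:
          - https://example.com/architecture-notes
      - type: text                 # inline notes; keys are assumptions, see the docs
        content: "Focus on the authentication flow."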

The goal is simple: efficiently provide AI language models like ChatGPT and Claude with the necessary context about your codebase.

Why you need this

When you're using AI in development, context isn't just helpful — it's everything.

  • Code Refactoring Help: Need AI assistance refactoring messy code? ctx creates clean, structured documents with all necessary code files.

  • Multiple Iteration Development: Working through several iterations with an AI helper requires constantly updating the context. ctx automates this process.

  • Documentation Generation: Transform your codebase into comprehensive documentation by combining source code with custom explanations. Use AI to generate user guides, API references, or developer documentation based on your actual code.

  • Seamless AI Integration: Thanks to built-in MCP support, you can connect Claude AI directly to your codebase, allowing for real-time, context-aware assistance without manual context sharing.

How it works

  1. Gathers code from files, directories, GitHub repositories, web pages, or plain text.
  2. Targets specific files through pattern matching, content search, size, or date filters.
  3. Applies optional modifiers, such as extracting PHP signatures without implementation details (see the sketch after this list).
  4. Organizes content into well-structured markdown documents.
  5. Saves context files ready to be shared with LLMs.
  6. Optionally serves context through an MCP server, allowing AI assistants like Claude to access project information directly.
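
As a rough illustration of steps 2 and 3, a file source narrows the selection with a pattern and then passes the result through a modifier. The modifiers key and the php-signature name are assumptions here; the documentation and JSON schema list the exact identifiers:

documents:
  - description: "Service layer, signatures only"
    outputPath: "services-context.md"
    sources:
      - type: file
        sourcePaths:
          - src/Service
        filePattern: "*Service.php"   # step 2: pattern matching
        modifiers:
          - php-signature             # step 3: assumed modifier name, strips method bodies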

With ctx, your AI conversations just got a whole lot smarter—and easier.

Quick Start

Getting started with Context Generator is straightforward. Follow these simple steps to create your first context file for LLMs.

1. Install Context Generator

Download and install the tool using our installation script:

curl -sSL https://raw.githubusercontent.com/context-hub/generator/main/download-latest.sh | sh

This installs the ctx command to your system (typically in /usr/local/bin).

Want more options? See the complete Installation Guide for alternative installation methods.
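
To confirm the binary landed on your PATH, a quick sanity check helps (the --version flag is the usual console convention; exact output will vary):

# confirm the command is reachable
which ctx

# print the installed version
ctx --version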

2. Initialize a Configuration File

Create a new configuration file in your project directory:

ctx init

This generates a context.yaml file with a basic structure to get you started.

Pro tip: Run ctx init --type=json if you prefer JSON configuration format. Check the Command Reference for all available commands and options.

3. Describe Your Project Structure

Edit the generated context.yaml file to specify what code or content you want to include. For example:

documents:
  - description: "User Authentication System"
    outputPath: "auth-context.md"
    sources:
      - type: file
        description: "Authentication Controllers"
        sourcePaths:
          - src/Auth
        filePattern: "*.php"

      - type: file
        description: "Authentication Models"
        sourcePaths:
          - src/Models
        filePattern: "*User*.php"

This configuration will gather all PHP files from the src/Auth directory and any PHP files containing "User" in their name from the src/Models directory.
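
If you prefer the JSON format from the pro tip above, the same configuration translates directly; field names are identical, only the syntax changes:

{
  "documents": [
    {
      "description": "User Authentication System",
      "outputPath": "auth-context.md",
      "sources": [
        {
          "type": "file",
          "description": "Authentication Controllers",
          "sourcePaths": ["src/Auth"],
          "filePattern": "*.php"
        },
        {
          "type": "file",
          "description": "Authentication Models",
          "sourcePaths": ["src/Models"],
          "filePattern": "*User*.php"
        }
      ]
    }
  ]
}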

Need more advanced configuration? See the Full Documentation section below for all available source types and options.

4. Build the Context

Generate your context file by running:

ctx

The tool will process your configuration and create the specified output file (auth-context.md in our example).

Tip: Increase the logging verbosity with -v, -vv, or -vvv for more detailed output.
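
For example, assuming the standard console verbosity flags:

# rebuild all documents defined in context.yaml with verbose logging
ctx -vv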

5. Share with an LLM

Upload or paste the generated context file to your favorite LLM (like ChatGPT or Claude). Now you can ask specific questions about your codebase, and the LLM will have the necessary context to provide accurate assistance.

Example prompt:

I've shared my authentication system code with you. Can you help me identify potential security vulnerabilities in the user registration process?

Next steps: Check out Development with Context Generator for best practices on integrating context generation into your AI-powered development workflow.

That's it! You're now ready to leverage LLMs with proper context about your codebase.

6. Connect to Claude AI (Optional)

For a more seamless experience, you can connect Claude directly to your codebase through the built-in MCP server.

Point the MCP client to the Context Generator server:

{
  "mcpServers": {
    "ctx": {
      "command": "ctx",
      "args": ["server", "-c", "/path/to/your/project"]
    }
  }
}

Note: Read more about MCP Server for detailed setup instructions.

Now you can ask Claude questions about your codebase without manually uploading context files!

JSON Schema

For a better editing experience, Context Generator provides a JSON schema for autocompletion and validation in your IDE:

# Show schema URL
ctx schema

# Download schema to current directory
ctx schema --download
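
As one hedged example, editors that use the Red Hat YAML language server (VS Code with the YAML extension, for instance) can pick the schema up from a modeline comment at the top of context.yaml; the placeholder stands for whatever URL ctx schema prints:

# yaml-language-server: $schema=<URL printed by "ctx schema">
documents:
  - description: "User Authentication System"
    outputPath: "auth-context.md"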

Learn more: See IDE Integration for detailed setup instructions for VSCode, PhpStorm, and other editors.

Full Documentation

For complete documentation, including all available features and configuration options, please visit:

https://docs.ctxgithub.com


License

This project is licensed under the MIT License.
