Altinity MCP
Altinity MCP Server is a production-ready MCP server designed to empower AI agents and LLMs to interact seamlessly with ClickHouse. It exposes your ClickHouse database as a set of standardized tools and resources that adhere to the MCP protocol, making it easy for agents built on OpenAI, Claude, or other platforms to query, explore, and analyse your data.
Why use this server?
- Seamless AI-agent integration: Designed so that agents built with OpenAI, Claude, or other LLM platforms can call your database as if it were a tool.
- Flexible transport support: STDIO for local workflows, HTTP for traditional REST-style calls, and SSE streaming for interactive flows.
- Full tooling and protocol support: Built-in tools for schema introspection, SQL execution, and resource discovery.
- Enterprise-grade security: Supports JWE/JWT authentication and TLS for both the ClickHouse connection and the MCP endpoints.
- Open-source and extensible: You can customise, extend, and embed it into your stack.
Key Features
- Transport Options:
  - STDIO: Run locally via standard input/output — ideal for embedded agents or local workflows.
  - HTTP: Exposes MCP tools as HTTP endpoints, enabling access from web apps, backends, and agents.
  - SSE (Server-Sent Events): Enables streaming responses — useful when you want the agent to receive results in chunks, respond interactively, or present live data.
- OpenAPI Integration: When HTTP or SSE mode is enabled, the server can generate a full OpenAPI (v3) specification describing all tools and endpoints. This makes it easy for OpenAI-based agents (or other LLM platforms) to discover and call your tools programmatically.
- Security & Authentication: Optional JWE token authentication, JWT signing, and TLS support for both the MCP server and the underlying ClickHouse connection.
- Dynamic Resource Discovery: The server can introspect the ClickHouse schema and automatically generate MCP “resources” (tables, views, sample data) so agents understand your data context without manual intervention.
- Configuration Flexibility: Configure via environment variables, a YAML/JSON configuration file, or CLI flags. Includes hot-reload support so you can adjust configuration without a full restart.
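With the OpenAPI specification enabled, an agent framework can discover the available tools simply by walking the spec's `paths`. A minimal sketch of that discovery step; the trimmed spec document below (including the path and tool names) is an illustrative stand-in, not the server's actual output:

```python
# A trimmed OpenAPI v3 document standing in for what the server would serve
# over HTTP. The path and operation names here are assumptions for illustration.
spec = {
    "openapi": "3.0.3",
    "info": {"title": "Altinity MCP", "version": "1.0"},
    "paths": {
        "/tools/list_tables": {
            "post": {"operationId": "list_tables",
                     "summary": "List ClickHouse tables"}
        },
        "/tools/execute_query": {
            "post": {"operationId": "execute_query",
                     "summary": "Run a SQL query"}
        },
    },
}

def discover_tools(spec: dict) -> list[dict]:
    """Flatten an OpenAPI spec into (name, method, path) tool records."""
    tools = []
    for path, methods in spec["paths"].items():
        for method, op in methods.items():
            tools.append({
                "name": op["operationId"],
                "method": method.upper(),
                "path": path,
                "summary": op.get("summary", ""),
            })
    return tools

for tool in discover_tools(spec):
    print(f'{tool["name"]}: {tool["method"]} {tool["path"]}')
```

An agent can then match a user request against the `summary` fields to pick which tool to call.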
Use-Cases
- AI assistant integrated with OpenAI: For example, an agent built on OpenAI’s API reads your schema via the OpenAPI spec, selects the right tool, calls the MCP server’s HTTP/SSE endpoint, and returns analytic results to the user.
- Streaming analytics: For large result sets or interactive analytics flows, SSE streaming delivers progressive results and keeps your UI or agent responsive.
- Secure enterprise access: Instead of giving agents full database credentials, you expose the database via the MCP server with fine-grained auth, limit enforcement, TLS, and tool-level control.
- Schema-aware LLM workflows: Because the server exposes table and column metadata and sample rows as resources, the LLM can reason about your data structure, reducing errors and generating better SQL or queries.
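At the protocol level, an agent invokes a tool by sending a JSON-RPC 2.0 `tools/call` request, as defined by MCP. A sketch of building such a request; the tool name `execute_query` and its arguments are illustrative, not confirmed names from this server:

```python
import json
from itertools import count

_ids = count(1)  # JSON-RPC request ids should be unique per session

def build_tool_call(tool_name: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message."""
    request = {
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)

payload = build_tool_call(
    "execute_query",  # illustrative tool name
    {"query": "SELECT count() FROM system.tables"},
)
print(payload)
```

Over STDIO this message is written to the server's standard input; over HTTP it is posted to the MCP endpoint.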
Server Config
```json
{
  "mcpServers": {
    "altinity-mcp": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "JWE_SECRET_KEY=<string-for-decrypt-auth-tokens>",
        "ghcr.io/altinity/altinity-mcp"
      ],
      "env": {
        "JWE_SECRET_KEY": "<string-for-decrypt-auth-tokens>"
      }
    }
  }
}
```
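For reference, the entry above corresponds to a Docker invocation like the following sketch; the image name and the `JWE_SECRET_KEY` placeholder are taken directly from the config:

```shell
# Equivalent docker run derived from the mcpServers entry above.
# -i keeps stdin open for STDIO transport; --rm cleans up on exit.
docker run -i --rm \
  -e JWE_SECRET_KEY="<string-for-decrypt-auth-tokens>" \
  ghcr.io/altinity/altinity-mcp
```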