
Memory For AI

Created by solonai-com · 18 days ago
Fast, sub-millisecond, local, on-device long-term memory for LLMs, coding tools, and AI agents. Recall exact details from yesterday or five years ago. Install in 5 minutes.
Overview

GrantAi

Infinite Memory for AI
Local. Private. Secure.

Website · Download · Pricing · Documentation


What is GrantAi?

GrantAi is the shared memory layer for AI agents.

Coordination frameworks are everywhere — CrewAI, AutoGen, LangGraph. But agents still lose everything when a session ends. Context windows reset. Knowledge evaporates. Each agent starts from zero.

GrantAi solves this:

  • Persistent Memory — Knowledge survives sessions, accumulates over time
  • Shared Across Agents — Multiple AI tools read and write to the same brain
  • 12ms Recall — Sub-second retrieval regardless of memory size
  • 100% Local — Your data never leaves your machine
  • AES-256 Encrypted — Secure at rest, zero data egress

Quick Start

macOS / Linux (Native)

# 1. Download from https://solonai.com/grantai/download
# 2. Extract and install
./install.sh YOUR_LICENSE_KEY

# 3. Restart your AI tool (Claude Code, Cursor, etc.)

Docker (All Platforms)

docker pull ghcr.io/solonai-com/grantai-memory:1.8.5

Add to your Claude Desktop config (~/.config/Claude/claude_desktop_config.json):

{
  "mcpServers": {
    "grantai": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "--pull", "always",
               "-v", "grantai-data:/data",
               "-e", "GRANTAI_LICENSE_KEY=YOUR_KEY",
               "ghcr.io/solonai-com/grantai-memory:1.8.5"]
    }
  }
}
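A malformed config file is the most common reason a newly added MCP server never shows up in the client. As a quick sanity check (a sketch: it copies the snippet above into a temp file and round-trips it through a JSON parser), you can verify your edit before restarting:

```shell
# Sketch: catch JSON typos in the MCP config before restarting Claude Desktop.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
{
  "mcpServers": {
    "grantai": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "--pull", "always",
               "-v", "grantai-data:/data",
               "-e", "GRANTAI_LICENSE_KEY=YOUR_KEY",
               "ghcr.io/solonai-com/grantai-memory:1.8.5"]
    }
  }
}
EOF
# json.tool exits non-zero on any syntax error (trailing comma, missing quote, ...)
python3 -m json.tool "$cfg" > /dev/null && echo "config OK"
```

Run the same check against your real file with `python3 -m json.tool ~/.config/Claude/claude_desktop_config.json`.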

Supported Platforms

| Platform | Method |
| --- | --- |
| macOS (Apple Silicon) | Native |
| Linux (x64) | Native |
| Windows | Docker |
| All platforms | Docker |

MCP Tools

GrantAi provides these tools to your AI:

| Tool | Description |
| --- | --- |
| `grantai_infer` | Query memory for relevant context |
| `grantai_teach` | Store content for future recall |
| `grantai_learn` | Import files or directories |
| `grantai_health` | Check server status |
| `grantai_summarize` | Store session summaries |
| `grantai_project` | Track project state |
| `grantai_snippet` | Store code patterns |
| `grantai_git` | Import git commit history |
| `grantai_capture` | Save conversation turns for continuity |

Multi-Agent Memory Sharing

Multiple agents can share knowledge through GrantAi's memory layer.

Basic shared memory (no setup required)

# Any agent stores
grantai_teach(
    content="API rate limit is 100 requests/minute.",
    source="api-notes"
)

# Any agent retrieves
grantai_infer(input="API rate limiting")

All agents read from and write to the same memory pool. No configuration needed.
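The calls above are shown schematically. The shared-pool semantics they rely on can be sketched with a toy in-process stand-in (plain Python, not GrantAi's implementation; `teach`/`infer` here only mimic the tool contract, with naive word-overlap matching in place of real retrieval):

```python
# Toy stand-in for a shared memory pool: every agent writes to and reads
# from the same store, so knowledge is not scoped to one session or agent.
from dataclasses import dataclass, field

@dataclass
class Memory:
    content: str
    source: str

@dataclass
class SharedPool:
    memories: list = field(default_factory=list)

    def teach(self, content: str, source: str) -> None:
        """Store content for future recall (mimics grantai_teach)."""
        self.memories.append(Memory(content, source))

    def infer(self, input: str) -> list:
        """Return stored content whose words overlap the query (mimics grantai_infer)."""
        query = set(input.lower().split())
        return [m.content for m in self.memories
                if query & set(m.content.lower().split())]

pool = SharedPool()
# Agent A stores...
pool.teach(content="API rate limit is 100 requests/minute.", source="api-notes")
# ...and agent B retrieves from the same pool.
print(pool.infer(input="API rate limiting"))
# → ['API rate limit is 100 requests/minute.']
```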

With agent attribution (optional)

Use `speaker` to track which agent stored what, and `from_agents` to filter retrieval:

# Store with identity
grantai_teach(
    content="API uses Bearer token auth.",
    source="api-research",
    speaker="researcher"  # optional
)

# Retrieve from specific agent
grantai_infer(
    input="API authentication",
    from_agents=["researcher"]  # optional filter
)

When to use `speaker`

| Scenario | Use `speaker`? | Why |
| --- | --- | --- |
| Shared knowledge base | No | All contributions equal, no filtering needed |
| Session continuity | No | Same context, just persist and retrieve |
| Research → Code handoff | Yes | Coder filters for researcher's findings only |
| Role-based trust | Yes | Security agent's input treated differently |
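The attribution semantics in this table can be sketched the same way (a toy Python stand-in, not GrantAi's actual retrieval: `speaker` tags each memory on write, `from_agents` narrows reads to a subset of agents):

```python
# Toy sketch of agent attribution for a research → code handoff:
# the coder agent reads only what the researcher agent wrote.
def teach(store, content, source, speaker=None):
    store.append({"content": content, "source": source, "speaker": speaker})

def infer(store, input, from_agents=None):
    query = set(input.lower().split())
    return [m["content"] for m in store
            if (from_agents is None or m["speaker"] in from_agents)
            and query & set(m["content"].lower().split())]

store = []
teach(store, "API uses Bearer token auth.", source="api-research", speaker="researcher")
teach(store, "API tests are flaky on CI.", source="ci-notes", speaker="tester")

# Coder agent pulls only the researcher's findings; the tester's note is filtered out.
print(infer(store, "API authentication", from_agents=["researcher"]))
# → ['API uses Bearer token auth.']
```

Omitting `from_agents` returns matches from every agent, which is the default shared-knowledge-base behavior in the table above.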

Framework integration

GrantAi works with any MCP-compatible client. Point your agents at the same GrantAi instance:

{
  "mcpServers": {
    "grantai": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "--pull", "always",
               "-v", "grantai-data:/data",
               "-e", "GRANTAI_LICENSE_KEY=YOUR_KEY",
               "ghcr.io/solonai-com/grantai-memory:1.8.5"]
    }
  }
}

All agents using this config share the same memory volume (grantai-data).

Pricing

  • Free Trial — 30 days, no credit card required
  • Personal — $29/month or $299/year
  • Team — $25/seat/month

View full pricing →

Documentation

Support

License

GrantAi is proprietary software. See Terms of Service.


Get Started →

Server Config

{
  "mcpServers": {
    "grantai": {
      "command": "grantai-mcp",
      "env": {
        "GRANTAI_LICENSE_KEY": "your-key-here"
      }
    }
  }
}