Tome - Magical AI Spellbook

a magical desktop app that puts the power of LLMs and MCP in the hands of everyone

Join Us on Discord | License: Apache 2.0 | GitHub Release

🔮 Download the Tome Desktop App: Windows | MacOS

Tome

Tome is a desktop app that lets anyone harness the magic of LLMs and MCP. Download Tome, connect any local or remote LLM and hook it up to thousands of MCP servers to create your own magical AI-powered spellbook.

🫥 Want it to be 100% local, 100% private? Use Ollama and Qwen3 with only local MCP servers to cast spells in your own pocket universe. ⚡ Want state-of-the-art cloud models with the latest remote MCP servers? You can have that too. It's all up to you!

๐Ÿ—๏ธ This is a Technical Preview so bear in mind things will be rough around the edges. Join us on Discord to share tips, tricks, and issues you run into. Star this repo to stay on top of updates and feature releases!

🪄 Features

  • 🧙 Streamlined Beginner Friendly Experience
    • Simply download and install Tome and hook up the LLM of your choice
    • No fiddling with JSON, Docker, Python, or Node
  • 🤖 AI Model Support
    • Remote: Google Gemini, OpenAI, any OpenAI API-compatible endpoint
    • Local: Ollama, LM Studio, Cortex, any OpenAI API-compatible endpoint
  • 🔮 Enhanced MCP support
    • UI to install, remove, and turn MCP servers on/off
    • npm, uvx, node, and python MCP servers supported out of the box (see the sketch after this list)
  • 🏪 Integration with the Smithery.ai registry
    • Thousands of MCP servers available via one-click installation
  • ✏️ Customization of context windows and temperature
  • 🧰 Native support for tool calls and reasoning models
    • UI enhancements that clearly delineate tool calls and thinking messages
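
Under the hood, each MCP server entry is simply a command Tome knows how to spawn and speak the Model Context Protocol to over stdio. The sketch below is not Tome's code; it is a minimal Python illustration of that handshake, assuming uvx is on your PATH and using mcp-server-fetch as the example server. Inside Tome you never need to do any of this yourself.

```python
# Minimal sketch (not part of Tome): spawn a stdio MCP server and perform the
# JSON-RPC 2.0 "initialize" handshake. MCP's stdio transport exchanges
# newline-delimited JSON messages over the server's stdin/stdout.
import json
import subprocess

proc = subprocess.Popen(
    ["uvx", "mcp-server-fetch"],          # assumes uvx is installed and on PATH
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.DEVNULL,
    text=True,
)

initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "handshake-check", "version": "0.0.1"},
    },
}
proc.stdin.write(json.dumps(initialize) + "\n")
proc.stdin.flush()

# The server answers with its own name, version, and capabilities.
print(proc.stdout.readline().strip())
proc.terminate()
```

Tome performs the equivalent of this handshake (plus tool discovery and routing of tool calls) for every server you enable, whether it is launched via npm, uvx, node, or python.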

Demo

https://github.com/user-attachments/assets/0775d100-3eba-4219-9e2f-360a01f28cce

Getting Started

Requirements

Quickstart

  1. Install Tome
  2. Connect your preferred LLM provider - OpenAI, Ollama, and Gemini are preset, but you can also add providers like LM Studio by using http://localhost:1234/v1 as the URL (a quick way to sanity-check such an endpoint is sketched after these steps)
  3. Open the MCP tab in Tome and install your first MCP server (Fetch is an easy one to get started with; just paste uvx mcp-server-fetch into the server field).
  4. Chat with your MCP-powered model! Ask it to fetch the top story on Hacker News.
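
If step 2 gives you trouble with a local provider, a quick sanity check is to confirm the endpoint actually speaks the OpenAI-compatible API. The sketch below assumes LM Studio's default port; Ollama exposes the same style of API at http://localhost:11434/v1.

```python
# Sketch: list the models served by a local OpenAI-compatible endpoint before
# adding it to Tome. The URL below is LM Studio's default; adjust as needed.
import json
from urllib.request import urlopen

BASE_URL = "http://localhost:1234/v1"  # the same URL you would paste into Tome

with urlopen(f"{BASE_URL}/models") as resp:
    payload = json.load(resp)

# OpenAI-compatible servers respond with {"object": "list", "data": [{"id": ...}]}
for model in payload.get("data", []):
    print(model["id"])
```

If this prints at least one model id, the same URL should work in Tome's provider settings.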

Vision

We want to make local LLMs and MCP accessible to everyone. We're building a tool that allows you to be creative with LLMs, regardless of whether you're an engineer, tinkerer, hobbyist, or anyone in between.

Core Principles

  • Tome is local first: You are in control of where your data goes.
  • Tome is for everyone: You shouldn't have to manage programming languages, package managers, or JSON config files.

What's Next

We've gotten a lot of amazing feedback in the few weeks since releasing Tome, but we've got big plans for the future. We want to break LLMs out of their chatbox, and we've got a lot of features coming to help y'all do that.

  • Scheduled tasks: LLMs should be doing helpful things even when you're not in front of the computer.
  • Native integrations: MCP servers are a great way to access tools and information, but we want to add more powerful integrations to interact with LLMs in unique ways.
  • App builder: we believe that, long term, the best experiences will not be in a chat interface. We have plans to add additional tools that will enable you to create powerful applications and workflows.
  • ??? Let us know what you'd like to see! Join our community via the links below, we'd love to hear from you.

Community

Discord | Blog | Bluesky | Twitter
