Workflowy Local MCP
A desktop app that gives AI assistants full read/write access to your Workflowy account with fast local search. Unlike simple API wrappers, it caches your entire outline in a local SQLite database with full-text search - so your AI can instantly search across hundreds of thousands of nodes without hitting API rate limits. Your API key never leaves your machine.
Why This Exists
Workflowy's API has no search endpoint and a strict 1 request/minute rate limit on full exports. That makes direct API access painfully slow for AI assistants that need to explore your outline. This server solves that with local caching, instant full-text search, bookmarks with context, and AI-configurable instructions - all through a simple desktop app.
Features
- 7 tools for managing Workflowy nodes (read, edit, search, sync, bookmarks)
- Local SQLite cache with fast full-text search across all your nodes
- Bookmarks to save frequently-used node locations with context notes
- AI Instructions — create an "AI Instructions" node in Workflowy to customize LLM behavior across sessions
- Sync-on-access — reads auto-sync from the Workflowy API so data is always fresh
- Customizable server instructions and tool descriptions to tune AI behavior
- MCP logging — view server activity in the desktop app in real time
- Fully local — your API key never leaves your machine
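The sync-on-access behavior above rests on a simple staleness check: reads consult the local cache and trigger a resync once it is too old. A minimal sketch, using the 1-hour threshold mentioned in "How It Works" — the function and parameter names here are illustrative, not the app's actual API:

```typescript
// Illustrative staleness check for the local node cache.
// STALE_AFTER_MS mirrors the ">1 hour" rule from this README.
const STALE_AFTER_MS = 60 * 60 * 1000; // 1 hour

function isCacheStale(lastSyncMs: number | null, nowMs: number = Date.now()): boolean {
  // A cache that has never been synced is always stale.
  if (lastSyncMs === null) return true;
  return nowMs - lastSyncMs > STALE_AFTER_MS;
}
```

A check like this lets reads stay instant in the common case while guaranteeing the cache is never more than an hour behind the API.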
Installation
1. Download the latest release from GitHub Releases
   - macOS: `.dmg` file
   - Windows: `.msi` or `.exe` installer
2. macOS users: the app is unsigned, so you'll need to bypass Gatekeeper:
   - Option 1: Right-click the app → select "Open" → click "Open" in the dialog
   - Option 2: Run `xattr -cr /path/to/Workflowy\ Local\ MCP.app` in Terminal
   - If you see "damaged and can't be opened", use Option 2
3. Open the app and enter your Workflowy API key
   - Get one at workflowy.com/api-reference
4. Go to the Setup tab and follow the instructions for your MCP client (Claude Code, Claude Desktop, Cursor, or any app that supports MCP)
5. Restart your MCP client — the Workflowy tools are now available
Available Tools
| Tool | Description |
|---|---|
| `list_bookmarks` | List all saved bookmarks and the user's custom AI instructions. Call this at the start of every conversation. |
| `save_bookmark` | Save a node ID with a friendly name and context notes for future sessions |
| `delete_bookmark` | Delete a saved bookmark by name |
| `read_doc` | Read a node and its children via the LLM Doc API. Supports calendar targets (`today`, `tomorrow`, `next_week`, `inbox`) and configurable depth (1-10). Returns tag-as-key JSON format. |
| `edit_doc` | Edit nodes via the LLM Doc API. Supports insert, update, and delete operations in a single call. Can create nested structures with one request. |
| `search_nodes` | Search locally cached nodes by text. Returns results with breadcrumb paths and a preview of each result's children. |
| `sync_nodes` | Full sync of all Workflowy nodes to the local cache (rate limited to 1 request per minute) |
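The breadcrumb paths that `search_nodes` returns can be derived by walking parent links in the cached outline. A sketch under assumed names — the `CachedNode` shape and `buildBreadcrumb` function are hypothetical, not the server's actual implementation:

```typescript
// Minimal shape for a cached outline node (assumed for this example).
interface CachedNode {
  id: string;
  name: string;
  parentId: string | null;
}

// Walk parent links from a node up to the root, joining names
// into a "Root > Child > Grandchild" breadcrumb path.
function buildBreadcrumb(nodes: Map<string, CachedNode>, id: string): string {
  const parts: string[] = [];
  let current = nodes.get(id);
  while (current) {
    parts.unshift(current.name);
    current = current.parentId ? nodes.get(current.parentId) : undefined;
  }
  return parts.join(" > ");
}
```

For a node "Roadmap" nested under "Projects > 2025", this yields `"Projects > 2025 > Roadmap"`, which is what gives search results enough context to be useful without a follow-up read.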
How It Works
The server uses Workflowy's LLM Doc API for reads and writes, with a local SQLite cache for search:
- LLM Doc API: `read_doc` and `edit_doc` call Workflowy's `/api/llm/doc/read/` and `/api/llm/doc/edit/` endpoints directly for real-time access
- Calendar targets: use `today`, `tomorrow`, `next_week`, or `inbox` as node IDs — the API handles date resolution automatically
- Batch operations: `edit_doc` can perform multiple insert/update/delete operations in a single API call
- Local cache for search: `search_nodes` uses a SQLite cache that auto-syncs when stale (>1 hour)
- Rate limiting: the Workflowy `nodes-export` API (used for cache sync) is limited to 1 request per minute; the LLM Doc API has no rate limit
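To make the flow concrete, here is a sketch of building a read request against the LLM Doc endpoint listed above. Only the endpoint path and the calendar-target idea come from this README; the Bearer auth header, POST method, and body field names are assumptions — consult workflowy.com/api-reference for the actual contract:

```typescript
// Assumed request shape; field names are illustrative.
interface ReadDocRequest {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string;
}

function buildReadDocRequest(apiKey: string, nodeId: string, depth = 2): ReadDocRequest {
  return {
    url: "https://workflowy.com/api/llm/doc/read/",
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`, // auth scheme is an assumption
      "Content-Type": "application/json",
    },
    // Calendar targets like "today" or "inbox" can be passed as the node ID;
    // the API resolves them to dated nodes on its side.
    body: JSON.stringify({ node_id: nodeId, depth }),
  };
}
```

A request built this way could be handed to `fetch(req.url, req)`; the point is that no local cache is involved — reads and writes always hit the API directly.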
Desktop App
The Tauri-based desktop app provides a UI for configuration and monitoring:
- API Key — enter and validate your Workflowy API key
- Tools — customize server instructions and individual tool descriptions to tune AI behavior
- Setup — copy-paste configuration snippets for Claude Code, Claude Desktop, and Cursor
- Bookmarks — view, edit context, and delete saved bookmarks
- Cache — view cache status and trigger a manual sync
- Logs — view MCP server logs in real time (auto-refreshes every 3 seconds)
AI Instructions
You can create a node called "AI Instructions" in Workflowy to set persistent preferences for how LLMs interact with your data. For example:
- "Always add new tasks to my #inbox"
- "Use checkboxes for tasks, not bullets"
- "My calendar is under 'Daily Notes > 2025'"
The LLM will search for this node, save it as a reserved `ai_instructions` bookmark, and automatically load it at the start of every conversation.
Data Storage
All data is stored locally in the app data directory:
| Platform | Path |
|---|---|
| macOS | ~/Library/Application Support/com.workflowy.local-mcp/ |
| Windows | %APPDATA%\com.workflowy.local-mcp\ |
| Linux | ~/.local/share/com.workflowy.local-mcp/ |
Files stored: `config.json` (settings), `bookmarks.db` (SQLite database with bookmarks and node cache), `mcp-logs.json` (server logs).
The API key can also be set via the `WORKFLOWY_API_KEY` environment variable instead of the app config.
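The lookup order implied above — environment variable first, then the app config — can be sketched as a small resolver. The `AppConfig` shape is a guess based on the `config.json` mention, not the file's documented schema:

```typescript
// Assumed config shape; config.json's real schema may differ.
interface AppConfig {
  apiKey?: string;
}

// WORKFLOWY_API_KEY in the environment wins over the stored config.
function resolveApiKey(
  env: Record<string, string | undefined>,
  config: AppConfig
): string | null {
  return env.WORKFLOWY_API_KEY ?? config.apiKey ?? null;
}
```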
Works With
- Claude Desktop
- Claude Code
- Cursor
- Any application that supports the Model Context Protocol
Building from Source
Requires Node.js 18+ and Rust.
```bash
npm install
npm run tauri build
```
This runs the frontend build (tsc && vite build), bundles the MCP server with esbuild (npm run build:mcp), and packages everything into a Tauri desktop app.
Server Config
```json
{
  "mcpServers": {
    "workflowy": {
      "command": "node",
      "args": [
        "/path/to/workflowy-local-mcp/dist-mcp/server.js"
      ],
      "env": {
        "WORKFLOWY_API_KEY": "your-api-key"
      }
    }
  }
}
```