
mcp-pkm-logseq MCP server

Created by ruliana

A fairly customizable MCP server for interacting with your Logseq Personal Knowledge Management (PKM) system using custom instructions.

Components

Resources

  • logseq://guide - Initial instructions on how to interact with this knowledge base

Tools

  • get_personal_notes_instructions() - Get instructions on how to use the personal notes tool
  • get_personal_notes(topics, from_date, to_date) - Retrieve personal notes from Logseq that are tagged with the specified topics
  • get_todo_list(done, from_date, to_date) - Retrieve the todo list from Logseq
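For reference, a call to one of these tools travels over the wire as an MCP tools/call request. A hedged example is shown below; the exact argument names come from the signatures above, but the date format is an assumption (ISO 8601):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_personal_notes",
    "arguments": {
      "topics": ["python", "mcp"],
      "from_date": "2024-01-01",
      "to_date": "2024-12-31"
    }
  }
}
```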

Configuration

The following environment variables can be configured:

  • LOGSEQ_API_TOKEN: API token for authenticating with Logseq (default: "this-is-my-logseq-mcp-token")
  • LOGSEQ_URL: URL where the Logseq HTTP API is running (default: "http://localhost:12315")

Quickstart

Install

Claude Desktop and Cursor

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Published Servers Configuration
"mcpServers": {
  "mcp-pkm-logseq": {
    "command": "uvx",
    "args": [
      "mcp-pkm-logseq"
    ],
    "env": {
      "LOGSEQ_API_TOKEN": "your-logseq-api-token",
      "LOGSEQ_URL": "http://localhost:12315"
    }
  }
}

Claude Code

claude mcp add mcp-pkm-logseq uvx mcp-pkm-logseq

Start Logseq server

Logseq's HTTP API is an interface that runs within your desktop Logseq application. When enabled, it starts a local HTTP server (default port 12315) that allows programmatic access to your Logseq knowledge base. The API supports querying pages and blocks, searching content, and potentially modifying content through authenticated requests.

To enable the Logseq HTTP API server:

  1. Open Logseq and go to Settings (upper right corner)
  2. Navigate to Advanced
  3. Enable "Developer mode"
  4. Enable "HTTP API Server"
  5. Set your API token (this should match the LOGSEQ_API_TOKEN value in the MCP server configuration)

For more detailed instructions, see: https://logseq-copilot.eindex.me/doc/setup
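Once the steps above are done, you can exercise the API directly. Below is a minimal sketch of a raw call, assuming the default port and the `{"method": ..., "args": [...]}` request shape that Logseq's /api endpoint accepts; verify the method names against your Logseq version.

```python
import json
import urllib.request

LOGSEQ_URL = "http://localhost:12315"
LOGSEQ_API_TOKEN = "this-is-my-logseq-mcp-token"  # replace with your own token


def build_payload(method, *args):
    """Request body shape expected by Logseq's /api endpoint."""
    return {"method": method, "args": list(args)}


def logseq_api(method, *args):
    """POST a method call to the local Logseq HTTP API and return the JSON reply."""
    req = urllib.request.Request(
        f"{LOGSEQ_URL}/api",
        data=json.dumps(build_payload(method, *args)).encode(),
        headers={
            "Authorization": f"Bearer {LOGSEQ_API_TOKEN}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Example (requires Logseq running with the HTTP API server enabled):
# pages = logseq_api("logseq.Editor.getAllPages")
```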

Create MCP PKM Logseq Page

Create a page named "MCP PKM Logseq" in your Logseq graph to serve as the guide for AI assistants. Add the following content:

  • Description of your tagging system (e.g., which tags represent projects, areas, resources)
  • List of frequently used tags and what topics they cover
  • Common workflows you use to organize information
  • Naming conventions for pages and blocks
  • Instructions on how you prefer information to be retrieved
  • Examples of useful topic combinations for searching
  • Any context about your personal knowledge management approach

This page is returned to the AI assistant whenever it needs context about you and your knowledge base.
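For instance, such a page might look like the sketch below. The tags, conventions, and section names are purely illustrative; use whatever matches your own graph:

```markdown
# MCP PKM Logseq

## Tagging system
- #project/* — active projects (e.g. #project/home-lab)
- #area/* — ongoing areas of responsibility
- #resource/* — reference material

## Retrieval preferences
- Prefer notes from the last 90 days unless I ask otherwise
- Combine topics narrowly (e.g. topics: ["python", "testing"])

## Conventions
- Daily journal pages hold raw notes; evergreen pages use Title Case names
```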

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:
uv sync
  2. Build package distributions:
uv build

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:
uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD
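For example, a token-based publish might look like this (the token value is a placeholder):

```shell
UV_PUBLISH_TOKEN="pypi-your-token-here" uv publish
```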

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory /Users/ronie/MCP/mcp-pkm-logseq run mcp-pkm-logseq

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.

Add Development Servers Configuration to Claude Desktop

"mcpServers": {
  "mcp-pkm-logseq": {
    "command": "uv",
    "args": [
      "--directory",
      "/<parent-directories>/mcp-pkm-logseq",
      "run",
      "mcp-pkm-logseq"
    ],
    "env": {
      "LOGSEQ_API_TOKEN": "your-logseq-api-token",
      "LOGSEQ_URL": "http://localhost:12315"
    }
  }
}