Langfuse MCP (Model Context Protocol)

Test PyPI version · Python 3.10+ · License: MIT

This project provides a Model Context Protocol (MCP) server for Langfuse, allowing AI agents to query Langfuse trace data for better debugging and observability.

Quick Start with Cursor

Add to Cursor

Installation Options

🎯 From Cursor IDE: Click the button above (works seamlessly!)
🌐 From GitHub Web: Copy this deeplink and paste into your browser address bar:

cursor://anysphere.cursor-deeplink/mcp/install?name=langfuse-mcp&config=eyJjb21tYW5kIjoidXZ4IiwiYXJncyI6WyJsYW5nZnVzZS1tY3AiLCItLXB1YmxpYy1rZXkiLCJZT1VSX1BVQkxJQ19LRVkiLCItLXNlY3JldC1rZXkiLCJZT1VSX1NFQ1JFVF9LRVkiLCItLWhvc3QiLCJodHRwczovL2Nsb3VkLmxhbmdmdXNlLmNvbSJdfQ==

⚙️ Manual Setup: See Configuration section below

💡 Note: The "Add to Cursor" button only works from within Cursor IDE due to browser security restrictions on custom protocols (cursor://). This is normal and expected behavior per Cursor's documentation.

After installation: Replace YOUR_PUBLIC_KEY and YOUR_SECRET_KEY with your actual Langfuse credentials in Cursor's MCP settings.

Features

  • Integration with Langfuse for trace and observation data
  • Tool suite for AI agents to query trace data
  • Exception and error tracking capabilities
  • Session and user activity monitoring

Available Tools

The MCP server provides the following tools for AI agents (a client-side call sketch follows the list):

  • fetch_traces - Find traces based on criteria like user ID, session ID, etc.
  • fetch_trace - Get a specific trace by ID
  • fetch_observations - Get observations filtered by type
  • fetch_observation - Get a specific observation by ID
  • fetch_sessions - List sessions in the current project
  • get_session_details - Get detailed information about a session
  • get_user_sessions - Get all sessions for a user
  • find_exceptions - Find exceptions and errors in traces
  • find_exceptions_in_file - Find exceptions in a specific file
  • get_exception_details - Get detailed information about an exception
  • get_error_count - Get the count of errors
  • get_data_schema - Get schema information for the data structures
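
Below is a minimal sketch of how an agent-side MCP client could invoke these tools programmatically, using the official mcp Python SDK to launch the server over stdio, list its tools, and call fetch_traces. The argument names passed to call_tool (here limit) are illustrative assumptions, not taken from this project; use get_data_schema to discover the actual parameters.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch langfuse-mcp over stdio, the same way the IDE configurations below do.
server = StdioServerParameters(
    command="uvx",
    args=[
        "langfuse-mcp",
        "--public-key", "YOUR_PUBLIC_KEY",
        "--secret-key", "YOUR_SECRET_KEY",
        "--host", "https://cloud.langfuse.com",
    ],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List the tools the server advertises (fetch_traces, fetch_trace, ...).
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Hypothetical arguments; the real schema comes from get_data_schema.
            result = await session.call_tool("fetch_traces", arguments={"limit": 5})
            print(result.content)

asyncio.run(main())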

Setup

Install uv

First, make sure uv is installed. For installation instructions, see the uv installation docs.

If you already have an older version of uv installed, you might need to update it with uv self update.

Installation

uv pip install langfuse-mcp

Obtain Langfuse credentials

You'll need your Langfuse project's public key and secret key, plus the host URL (https://cloud.langfuse.com for Langfuse Cloud, or your self-hosted instance's URL). API keys can be created in your Langfuse project settings.

Running the Server

Run the server using uvx:

uvx langfuse-mcp --public-key YOUR_KEY --secret-key YOUR_SECRET --host https://cloud.langfuse.com

Configuration with MCP clients

Configure for Cursor

Create a .cursor/mcp.json file in your project root:

{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["langfuse-mcp", "--public-key", "YOUR_KEY", "--secret-key", "YOUR_SECRET", "--host", "https://cloud.langfuse.com"]
    }
  }
}

Configure for Claude Desktop

Add to your Claude Desktop configuration file (claude_desktop_config.json):

{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["langfuse-mcp"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "YOUR_KEY",
        "LANGFUSE_SECRET_KEY": "YOUR_SECRET",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}

Output Modes

Each tool supports different output modes to control the level of detail in responses (a hedged example follows the list):

  • compact (default): Returns a summary with large values truncated
  • full_json_string: Returns the complete data as a JSON string
  • full_json_file: Saves the complete data to a file and returns a summary with file information
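
The exact name of the parameter that selects the mode isn't shown here; assuming the tools expose it as an output_mode argument, a call from the MCP client session sketched earlier might look like this (illustrative only):

# Hypothetical parameter and trace ID, reusing the `session` from the earlier sketch.
result = await session.call_tool(
    "fetch_trace",
    arguments={
        "trace_id": "YOUR_TRACE_ID",      # ID of the trace to retrieve
        "output_mode": "full_json_file",  # assumed name for the output-mode switch
    },
)
print(result.content)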

Development

Clone the repository

git clone https://github.com/yourusername/langfuse-mcp.git
cd langfuse-mcp

Create a virtual environment and install dependencies

uv venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
uv pip install -e ".[dev]"

Set up environment variables

export LANGFUSE_SECRET_KEY="your-secret-key"
export LANGFUSE_PUBLIC_KEY="your-public-key"
export LANGFUSE_HOST="https://cloud.langfuse.com"  # Or your self-hosted URL

Testing

To run the demo client:

uv run examples/langfuse_client_demo.py --public-key YOUR_PUBLIC_KEY --secret-key YOUR_SECRET_KEY

Or use the convenience wrapper:

uv run run_mcp.py

Version Management

This project uses dynamic versioning based on Git tags:

  1. The version is automatically determined from git tags using uv-dynamic-versioning
  2. To create a new release:
    • Tag your commit with git tag v0.1.2 (following semantic versioning)
    • Push the tag with git push --tags
    • Create a GitHub release from the tag
  3. The GitHub workflow will automatically build and publish the package with the correct version to PyPI

For a detailed history of changes, please see the CHANGELOG.md file.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Cache Management

We use the cachetools library to keep the in-memory caches bounded (a minimal sketch follows the list):

  • Uses cachetools.LRUCache for better reliability
  • Configurable cache size via the CACHE_SIZE constant
  • Automatically evicts the least recently used items when caches exceed their size limits
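
As a rough illustration of this pattern (not the project's actual code), the following shows how a bounded LRU cache from cachetools might be used; CACHE_SIZE matches the constant mentioned above, while the helper function and the default value are assumptions:

from cachetools import LRUCache

CACHE_SIZE = 1000  # illustrative value; the real constant lives in the server code

# Once the cache holds CACHE_SIZE entries, the least recently used one is evicted.
trace_cache = LRUCache(maxsize=CACHE_SIZE)

def get_trace_cached(trace_id, fetch_fn):
    """Return a trace from the cache, fetching and storing it on a miss."""
    if trace_id not in trace_cache:
        trace_cache[trace_id] = fetch_fn(trace_id)
    return trace_cache[trace_id]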