
Project Introduction

Created by xiaoxiaoningdesui
A lightweight input-output tracker for large language models: launch with a single command, integrate effortlessly.
Overview

What is LLM-Tracker?

LLM-Tracker is a lightweight input-output tracker for large language models, designed to record interactions between applications and models during development.

How to use LLM-Tracker?

To use LLM-Tracker, build the project with Go and launch the resulting binary with ./llm-tracker.exe --configFile=config.toml. Make sure the referenced configuration file exists before starting.
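The build-and-run steps above can be sketched as shell commands. The binary name and the --configFile flag come from the project description; the go build invocation assumes a standard Go module at the repository root:

```shell
# Build the tracker binary (assumes a Go module in the current directory)
go build -o llm-tracker.exe .

# Launch it with a TOML configuration file, as documented above
./llm-tracker.exe --configFile=config.toml
```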

Key features of LLM-Tracker

  • Hides the tools list in the recorded input to reduce clutter during interactions.
  • Supports both streaming and non-streaming responses.
  • Works across multiple interaction modes: mcp, function_call, and chat.
  • Escapes special characters in input and output for better readability.

Use cases of LLM-Tracker

  1. Tracking input-output interactions for large language models in real-time.
  2. Integrating with applications that use backends such as Ollama and DeepSeek.
  3. Debugging and optimizing model interactions by analyzing recorded data.

FAQ about LLM-Tracker

  • What models does LLM-Tracker support?

LLM-Tracker currently supports models served via Ollama and DeepSeek, and in principle any model compatible with the OpenAI SDK.

  • How do I integrate LLM-Tracker with my application?

Point the IP and port in your workflow/client code at 127.0.0.1:1234 so that requests pass through LLM-Tracker instead of going directly to the model endpoint.
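As a minimal sketch of that redirection: the host and port come from the answer above, but the /v1/chat/completions path is an assumption based on the OpenAI-compatible convention, not something the project documents here.

```python
import json

# Host and port from the FAQ above; the /v1/chat/completions path is assumed
# to follow the OpenAI-compatible convention (an assumption, not documented).
TRACKER_URL = "http://127.0.0.1:1234/v1/chat/completions"

# A standard OpenAI-style chat payload; the model name is whatever your
# backend (e.g. a model served via Ollama or DeepSeek) actually exposes.
payload = {
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": False,  # LLM-Tracker records both streaming and non-streaming calls
}
body = json.dumps(payload)

# Send `body` as an HTTP POST to TRACKER_URL with your usual HTTP client;
# the tracker records the exchange on its way to the real model endpoint.
```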

  • Is LLM-Tracker free to use?

Yes! LLM-Tracker is open-source and available under the MIT license.
