
MPC MCP Server

Created by ronviers · 24 days ago
A four-valued logic in which the truth values V = {c, s, k, r} represent the canonical regimes of the constraint energy landscape.
Overview

MPC — Metastable Propositional Calculus

An engine for Metastable Propositional Calculus (MPC). It analyzes the thermodynamic and physical feasibility of logical assertions across multi-step reasoning. Unlike standard Boolean logic, MPC detects epistemic drift and structural conflicts (k-states) by calculating the energetic holding costs of maintaining premises over time. Use it to rigorously verify whether a complex sequence of claims can be maintained together without collapsing into contradiction.

Truth values: c (committed) · s (suspended) · k (conflict) · r (reset)
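As a minimal illustration of the four-valued domain, the truth values can be modeled as an enumeration. The class and member names below are illustrative only (the library's actual representation is not shown in this document); only the letters c/s/k/r and their labels come from the text above.

```python
from enum import Enum

class Phase(Enum):
    """Sketch of MPC's four truth values; names are assumed, not the library's API."""
    COMMITTED = "c"
    SUSPENDED = "s"
    CONFLICT = "k"   # structural conflict (k-state)
    RESET = "r"

print([p.value for p in Phase])  # → ['c', 's', 'k', 'r']
```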


What's new in v0.3

| Feature | Detail |
| --- | --- |
| FastMCP | Replaced the low-level MCP server with FastMCP for cleaner tool definitions and better client compatibility |
| Five-provider routing | Anthropic · Google · OpenAI · Kimi (Moonshot AI) · Ollama — all first-class |
| Dynamic model listing | Real-time queries to each provider's list-models endpoint; 5-minute TTL cache |
| Zero env-var conflicts | Each provider has its own variable: `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`, `OPENAI_API_KEY`, `KIMI_API_KEY`, `OLLAMA_HOST` |
| Provider tab in UI | Side-by-side key configuration, per-provider status, model counts |
| `list_available_models` tool | New MCP tool for dynamic model discovery |
| Retry logic | Transient errors retried up to 3× with exponential back-off |
| `/status` + `/env` endpoints | Provider health and env-var audit without exposing key values |
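The retry behavior described above (up to 3 attempts with exponential back-off) can be sketched as follows. This is a hypothetical helper, not the router's actual implementation; `ConnectionError` stands in for whatever transient-error class the router catches, and the base delay is an assumed value.

```python
import time

def call_with_retry(fn, max_attempts=3, base_delay=1.0):
    """Retry transient failures with exponential back-off (illustrative sketch)."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # out of attempts: propagate the error
            time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, ... between attempts
```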

Providers

| Provider | SDK | Key Env Var | Notes |
| --- | --- | --- | --- |
| Anthropic | `anthropic` | `ANTHROPIC_API_KEY` | Claude: claude-opus-4-6, claude-sonnet-4-6, Haiku |
| Google | `google-generativeai` | `GOOGLE_API_KEY` | Gemini 2.5 Pro, 2.0 Flash, 1.5 |
| OpenAI | `openai` | `OPENAI_API_KEY` | GPT-4o, o1, o3-mini, etc. |
| Kimi | `openai` (OpenAI-compatible) | `KIMI_API_KEY` | Moonshot AI · moonshot.cn |
| Ollama | stdlib `urllib` | `OLLAMA_HOST` | Local models, no key needed |

API keys are resolved in order: explicit argument → environment variable → .env file. No provider ever reads another provider's key variable.
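The documented resolution order (explicit argument → environment variable → .env file) can be sketched in a few lines. The function name and signature here are illustrative, not the library's API; the `.env` contents are modeled as a plain dict.

```python
import os

def resolve_api_key(env_var, explicit_key=None, dotenv=None):
    """Sketch of the key-resolution order: argument -> environment -> .env file."""
    if explicit_key:
        return explicit_key              # 1. explicit argument wins
    if os.environ.get(env_var):
        return os.environ[env_var]       # 2. provider-specific env var
    return (dotenv or {}).get(env_var)   # 3. fall back to the .env file

print(resolve_api_key("ANTHROPIC_API_KEY", explicit_key="sk-demo"))  # → sk-demo
```

Because each provider resolves only its own variable, setting `OPENAI_API_KEY` never affects the Anthropic backend, and vice versa.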


Install

```bash
# Core — all five providers included
pip install -e .

# Add QuTiP for exact partition function (Ising Hamiltonian)
pip install qutip

# Add NetKet for spin-glass ground-state solver (requires JAX)
pip install "jax[cpu]" netket

# Full stack
pip install -e ".[full]"
```

Requires Python ≥ 3.11.


Quick start

Set API keys

```bash
# Any combination — only set what you have
export ANTHROPIC_API_KEY=sk-ant-…
export GOOGLE_API_KEY=AIza…
export OPENAI_API_KEY=sk-…
export KIMI_API_KEY=sk-…                    # Moonshot AI key
export OLLAMA_HOST=http://localhost:11434   # default; omit if standard
```

Or use the Providers tab in the browser UI to enter and persist keys to .env.

Start the server

```bash
mpc-server
```

Starts:

  • MCP server on stdio (register in Claude Desktop or any MCP client)
  • Browser UI on http://localhost:7771

Claude Desktop configuration

```json
{
  "mcpServers": {
    "mpc": {
      "command": "mpc-server",
      "env": {
        "ANTHROPIC_API_KEY": "sk-ant-…",
        "GOOGLE_API_KEY": "AIza…",
        "OPENAI_API_KEY": "sk-…",
        "KIMI_API_KEY": "sk-…"
      }
    }
  }
}
```

MCP Tools

| Tool | Description | API call? |
| --- | --- | --- |
| `compile_text` | Full MPC analysis: hypotheses, phases, frustration matrix, free energy (QuTiP), spin-glass ground state, Theorem 6.1 bound | ✓ |
| `compile_sequence` | Multi-step trace with Entity Ledger — tracks epistemic drift and η_i accumulation | ✓ |
| `read_claims` | Per-claim phase assignment (c/s/k/r) with rationale | ✓ |
| `budget_estimate` | Theorem 6.1 N_max = O(√(2E*/αε_min d_avg)) — pure arithmetic | ✗ |
| `list_available_models` | Real-time model catalogue across all five providers | ✗ (cached) |

All tools accept an optional `provider_api_key` parameter; when it is omitted, the key is read from environment variables automatically.
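The Theorem 6.1 bound is pure arithmetic, so it can be computed directly. The sketch below takes the O(...) constant as 1 and treats α as a free parameter; the function name and the sample values are illustrative, not output of the library's `budget_estimate`.

```python
import math

def n_max_bound(E_star, alpha, epsilon_min, d_avg):
    """Theorem 6.1: N_max = O(sqrt(2 E* / (alpha * eps_min * d_avg))).

    Hidden constant taken as 1; alpha is an assumed parameter of the bound.
    """
    return math.sqrt(2 * E_star / (alpha * epsilon_min * d_avg))

# Sample values (assumed): energy budget E* = 20, alpha = 1
print(n_max_bound(E_star=20, alpha=1.0, epsilon_min=1.2, d_avg=2.5))
```

As expected from the square root, quadrupling the energy budget E* only doubles the number of maintainable premises.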


Python API

```python
import mpc_core

# Full analysis — Anthropic (default)
result = mpc_core.compile("Your text here…", api_key="sk-ant-…")

# Google Gemini
result = mpc_core.compile("…", model="gemini-2.0-flash", api_key="AIza…")

# OpenAI
result = mpc_core.compile("…", model="gpt-4o", api_key="sk-…")

# Kimi (Moonshot AI)
result = mpc_core.compile("…", model="moonshot-v1-32k", api_key="sk-…")

# Local Ollama (no key needed)
result = mpc_core.compile("…", model="llama3:8b")

print(result.energy_model.free_energy)      # F = -kT ln Z
print(result.ground_state.energy)           # Ising ground-state energy
print(result.ground_state.stable_ids)       # most compatible hypothesis subset
print(result.analytical_summary)

# Per-claim phase assignment
phases = mpc_core.read_claims(
    ["All ravens are black.", "Some ravens are albino."],
    api_key="sk-ant-…",
    model="claude-sonnet-4-6",
)

# Budget theorem (no API call)
est = mpc_core.budget_estimate(N=8, d_avg=2.5, epsilon_min=1.2)
print(est.interpretation)

# Multi-step trace with Entity Ledger
seq = mpc_core.compile_sequence(
    ["Step 1 text…", "Step 2 text…", "Step 3 text…"],
    api_key="sk-ant-…",
)

# Dynamic model listing
from mpc_core.providers import list_models, ProviderID
models = list_models(ProviderID.ANTHROPIC, "sk-ant-…")

# Free-energy surface (no API call)
from mpc_core.thermodynamics import free_energy_surface
surface = free_energy_surface(my_epsilon_matrix, T_range=(0.2, 5), E_star_range=(2, 40))
# surface["F"][T_index][E_star_index] → F value
```
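The free-energy quantity reported above, F = -kT ln Z, can be checked on a toy two-level system. This sketch uses units with k = 1 and does not reproduce the library's QuTiP-based Ising computation; it only illustrates the formula.

```python
import math

def free_energy(energies, T, k=1.0):
    """F = -kT ln Z for a discrete spectrum (toy sketch, k = 1 by default)."""
    Z = sum(math.exp(-e / (k * T)) for e in energies)  # partition function
    return -k * T * math.log(Z)

# Two-level system with energies 0 and 1 at T = 1
print(free_energy([0.0, 1.0], T=1.0))  # ≈ -0.313
```

As T → 0 the free energy approaches the ground-state energy, which is why low-temperature analysis singles out the most compatible hypothesis subset.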

Browser UI — tabs

| Tab | Description |
| --- | --- |
| Analyse text | Full MPC analysis with hypothesis cards, thermodynamic strip (Z, F, S), ground-state box |
| Read claims | One claim per line → instant phase assignment |
| 3D Free Energy | Interactive Plotly F(T, E*) surface with Theorem 6.1 N_max contour |
| Energy landscape | Animated 2-D canvas with budget/temperature sliders |
| Compatibility matrix | Pairwise ε_ij frustration table |
| Budget calculator | Theorem 6.1 N_max — no API key needed |
| Historical Heatmap | η_i accumulation across a reasoning trace (Addendum V) |
| Providers | API key configuration, per-provider status and model counts |

HTTP API (localhost:7771)

| Method | Path | Description |
| --- | --- | --- |
| GET | `/` | Browser UI (index.html) |
| GET | `/models` | Model catalogue using env-var keys |
| POST | `/models` | Refresh catalogue with caller-supplied keys |
| GET | `/env` | Which env vars are set (values redacted) |
| POST | `/status` | Provider connectivity health check |
| POST | `/setenv` | Persist a key to .env |
| POST | `/` | MPC actions (compile, compile_sequence, read_claims, budget_estimate, free_energy_surface, ground_state) |

Testing

```bash
# Arithmetic tests (no API key required)
pytest tests/ -v

# Live API tests — any provider combination
ANTHROPIC_API_KEY=sk-ant-…  pytest tests/ -v
GOOGLE_API_KEY=AIza…        pytest tests/ -v
OPENAI_API_KEY=sk-…         pytest tests/ -v
KIMI_API_KEY=sk-…           pytest tests/ -v
```

Architecture

```
mpc_core/
  providers.py       NEW v0.3 — ProviderID enum, per-provider key resolution,
                     dynamic model listing with TTL cache, all_models_catalogue()
  router.py          REWRITTEN — five-backend dispatch, retry logic, auth-error fast-fail
  compiler.py        compile() · read_claims() · budget_estimate() · compile_sequence()
  entity_ledger.py   Cross-step entity registry (four-layer pipeline)
  thermodynamics.py  QuTiP partition function · NetKet spin-glass · free_energy_surface()
  json_repair.py     Best-effort repair of truncated LLM JSON
  models.py          MPCResult dataclass hierarchy

mpc_server/
  server.py          REWRITTEN — FastMCP application + HTTP UI proxy (port 7771)
                     New endpoints: GET /env, POST /status, POST /models

static/
  index.html         REWRITTEN — provider selector tabs, per-provider key inputs,
                     dynamic model dropdowns, /status integration
```

Roadmap

  • v0.1 ✓ Core compiler, MCP server, reference UI
  • v0.2 ✓ Multi-backend routing · QuTiP · NetKet · 3D Plotly · Historical heatmap
  • v0.3 ✓ FastMCP · Five providers · Dynamic model listing · Provider UI · Retry logic
  • v0.4 Spectral Laplacian extension of Theorem 6.1, community-aware N_max bounds
  • v0.5 Streaming analysis; differential η_i display per hypothesis per step

License

MIT
