
invinoveritas — Lightning-Paid AI Reasoning

Created by babyblueviper · 1 month ago
Lightning-paid strategic reasoning and structured decision intelligence for autonomous agents. Pay per insight using Bitcoin Lightning (L402 protocol). No subscriptions, no accounts, no KYC.

  • /reason → High-quality strategic analysis
  • /decision → Structured JSON decisions with confidence and risk level

Perfect for agents that need reliable, high-signal reasoning with deterministic cost.
Overview

⚡ invinoveritas


Lightning-paid reasoning and decision intelligence for autonomous agents

Pay-per-insight API using the L402 protocol (Bitcoin Lightning).
No subscriptions. No accounts. No KYC.

Built primarily for autonomous AI agents that need high-quality reasoning and structured decisions without increasing local compute costs.

Live API: https://invinoveritas.onrender.com


Why Agents Use invinoveritas

Autonomous agents call invinoveritas when they need:

  • Strategic reasoning deeper than local models
  • Structured, machine-readable decisions with confidence scores
  • Deterministic cost per request (instead of unpredictable token usage)
  • Cryptographically verifiable payments via Lightning

Agents pay only when confidence matters — with sats, atomically, without intermediaries.


Quick Start for Agents

Option 1 — MCP Server (Recommended)

Works seamlessly with Claude Desktop, Cursor, and any MCP-compatible agent.

pip install mcp requests lndgrpc pyln-client
python mcp_server.py

Then add it to your agent configuration.

Option 2 — CLI Client

python agent_client.py --endpoint reason --question "What are the biggest risks for Bitcoin in 2026?"

python agent_client.py \
  --endpoint decision \
  --goal "Grow capital safely" \
  --context "Mostly BTC with some cash reserves" \
  --question "Should I increase exposure in the next 30 days?"

Option 3 — Direct HTTP

# 1. Check price
curl https://invinoveritas.onrender.com/price/reason

# 2. Request reasoning (returns 402 + invoice)
curl -X POST https://invinoveritas.onrender.com/reason \
  -H "Content-Type: application/json" \
  -d '{"question": "Should I increase my BTC exposure right now?"}'

# 3. After paying the invoice, retry with credentials
curl -X POST https://invinoveritas.onrender.com/reason \
  -H "Content-Type: application/json" \
  -H "Authorization: L402 <payment_hash>:<preimage>" \
  -d '{"question": "Should I increase my BTC exposure right now?"}'

Core Endpoints

| Endpoint | Purpose | Output Type | Typical Cost |
| --- | --- | --- | --- |
| POST /reason | Strategic reasoning | Natural language | ~500–700 sats |
| POST /decision | Structured decision intelligence | Clean JSON | ~1000–1200 sats |

Response Examples

/reason

{
  "status": "success",
  "type": "premium_reasoning",
  "answer": "..."
}

/decision

{
  "status": "success",
  "type": "decision_intelligence",
  "result": {
    "decision": "Increase exposure slightly",
    "confidence": 0.78,
    "reasoning": "Market structure improving while risk remains moderate.",
    "risk_level": "medium"
  }
}
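Before acting on a /decision payload like the one above, an agent can validate it defensively. A minimal sketch in Python (the field names follow the example response; the helper name is illustrative, not part of the API):

```python
def parse_decision(payload: dict) -> dict:
    """Validate a /decision response and return its result fields.

    Raises ValueError if the payload is not a successful decision.
    """
    if payload.get("status") != "success":
        raise ValueError(f"unexpected status: {payload.get('status')!r}")
    result = payload.get("result", {})
    confidence = float(result["confidence"])
    if not 0.0 <= confidence <= 1.0:
        raise ValueError(f"confidence out of range: {confidence}")
    return {
        "decision": result["decision"],
        "confidence": confidence,
        "risk_level": result.get("risk_level", "unknown"),
    }
```

Rejecting malformed or failed responses early keeps downstream logic (trading, planning) from acting on garbage.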

Payment Flow (L402)

  1. POST to /reason or /decision → receive HTTP 402 with bolt11 invoice
  2. Pay the invoice with any Lightning wallet or node
  3. Retry the same request with:
    Authorization: L402 <payment_hash>:<preimage>
    
  4. Receive the AI response

The MCP server and agent_client.py handle the full flow automatically.
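For clients implementing the flow themselves, the four steps above can be sketched with the standard library. This is a sketch under assumptions: the invoice field name in the 402 body is assumed, and invoice payment is delegated to a caller-supplied callback since it depends on your Lightning stack:

```python
import json
import urllib.request
from urllib.error import HTTPError

def build_l402_header(payment_hash: str, preimage: str) -> str:
    """Format Lightning payment credentials as an L402 Authorization header."""
    return f"L402 {payment_hash}:{preimage}"

def call_l402(url: str, body: dict, pay_invoice) -> dict:
    """POST to an L402-protected endpoint, paying the invoice on a 402.

    pay_invoice is a callable taking a bolt11 string and returning
    (payment_hash, preimage); how it pays depends on your wallet or node.
    """
    data = json.dumps(body).encode()
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)
    except HTTPError as err:
        if err.code != 402:
            raise
        # Assumed shape: the 402 body carries the bolt11 invoice.
        invoice = json.load(err)["invoice"]
    payment_hash, preimage = pay_invoice(invoice)
    # Retry the same request with the L402 credentials attached.
    req.add_header("Authorization", build_l402_header(payment_hash, preimage))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```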


For Autonomous Agents

The recommended integration path is the included mcp_server.py.

It exposes reason and decision as native MCP tools. Agents simply call the tool — payment is handled transparently in the background.

Also included:

  • agent_client.py — simple CLI for scripts
  • Freqtrade strategy example using /decision as a confidence gate
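The confidence-gate pattern mentioned above can be reduced to a few lines: act only when /decision returns enough confidence at an acceptable risk level. A minimal illustration (the thresholds and risk ordering are illustrative, not taken from the bundled strategy file):

```python
# Ordering of risk levels, lowest to highest (illustrative).
RISK_ORDER = {"low": 0, "medium": 1, "high": 2}

def passes_gate(result: dict, min_confidence: float = 0.7,
                max_risk: str = "medium") -> bool:
    """Return True if a /decision result clears the confidence gate."""
    confidence = result.get("confidence", 0.0)
    risk = result.get("risk_level", "high")
    # Unknown risk levels are treated as worse than any known level.
    return (confidence >= min_confidence
            and RISK_ORDER.get(risk, 99) <= RISK_ORDER[max_risk])
```

A strategy would call /decision at its entry signal and only open a position when `passes_gate(...)` is true, so paid reasoning is consulted exactly when capital is at stake.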

Discovery Endpoints (Always Free)

| Endpoint | Purpose |
| --- | --- |
| GET /health | Service status and metadata |
| GET /price/{endpoint} | Current price in sats |
| GET /tool | Machine-readable tool definition |
| GET /.well-known/ai-plugin.json | Standard agent discovery |
| GET /docs | Interactive API docs |
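Because the discovery endpoints are free and unauthenticated, an agent can check the current price before committing sats to a paid call. A small sketch (the JSON shape of the price response is an assumption; the helper names are illustrative):

```python
import json
import urllib.request

API_BASE = "https://invinoveritas.onrender.com"

def discovery_url(base: str, path: str) -> str:
    """Join the API base with a discovery path, normalizing slashes."""
    return base.rstrip("/") + "/" + path.lstrip("/")

def get_price(endpoint: str, base: str = API_BASE) -> dict:
    """Fetch the current price (in sats) for a paid endpoint."""
    with urllib.request.urlopen(discovery_url(base, f"price/{endpoint}")) as resp:
        return json.load(resp)
```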

Project Structure

invinoveritas/
├── app.py                 # Main API (deployed on Render)
├── ai.py                  # Reasoning and decision logic
├── config.py
├── node_bridge.py         # Talks to Lightning bridge
├── bridge.py              # LND bridge (runs on VPS)
├── mcp_server.py          # MCP server for Claude, Cursor, etc.
├── agent_client.py        # CLI client
├── examples/
│   └── invinoveritas_strategy.py
├── index.html
├── requirements.txt
└── README.md

Deployment

Main API — Render
Lightning Bridge — VPS running LND

Environment Variables (Render):

  • OPENAI_API_KEY
  • NODE_URL (your Lightning bridge URL)

Philosophy

Most AI agents are forced into monthly subscriptions or unpredictable token costs.

invinoveritas offers a better model:

Atomic intelligence purchases using Bitcoin Lightning.

Agents pay only when the decision truly matters — with sats, cryptographically, and without intermediaries.


Built for the Bitcoin × AI future. ⚡



Server Config

{
  "mcpServers": {
    "invinoveritas": {
      "command": "python",
      "args": [
        "/absolute/path/to/your/mcp_server.py"
      ],
      "env": {
        "API_BASE": "https://invinoveritas.onrender.com",
        "LND_DIR": "/root/.lnd"
      },
      "disabled": false,
      "alwaysAllow": [
        "reason",
        "decision"
      ]
    }
  }
}