
OwnRig - AI Hardware Compatibility

Created by OwnRig a month ago
AI hardware compatibility data for running LLMs locally. Query 50 models, 25 GPUs, 9 ready-to-buy machines, and 663 compatibility entries. Get hardware recommendations, check VRAM requirements, and find buy links.
Overview

ownrig-mcp

1.0.0 • Public • Published

OwnRig MCP Server

AI hardware compatibility data for any MCP-compatible assistant. Query 50 models, 25 devices, 14 builds, 9 ready-to-buy machines, and 663 compatibility entries.

Transport: stdio

Install

npm install -g ownrig-mcp

Or use directly with npx:

npx ownrig-mcp

Tools

Tool                      Description
query_model               Get details for a specific AI model (VRAM, formats, use cases)
query_device              Get specs for a GPU or Apple Silicon device
query_compatibility       Check if a model runs on a device (tokens/sec, VRAM fit)
list_models               List models with optional use_case / family filter
list_devices              List devices with optional type / min_vram filter
list_builds               List curated builds with optional tier / profile filter
list_systems              List ready-to-buy machines (Mac, Dell, ASUS) with optional brand / type filter
query_system              Get full details for a specific ready-to-buy machine
recommend_build           Full recommendation engine: 3 paths (model→hw, workflow→hw, hw→models)
find_models_for_device    "What can I run on my RTX 4090?"
find_devices_for_model    "What GPU do I need for Llama 3.1 70B?"
list_workflows            List workflow profiles (tools → hardware requirements)
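
Over stdio, an MCP client invokes these tools with standard JSON-RPC tools/call requests. As an illustration only (the argument names below are assumptions, not taken from the package's actual schema), a query_compatibility call might look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_compatibility",
    "arguments": {
      "model": "llama-3.1-70b",
      "device": "rtx-4090"
    }
  }
}
```

In practice your MCP client (Claude Desktop, Cursor, etc.) constructs these requests for you; the shape is shown here only to clarify what "querying a tool" means.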

Usage with Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "ownrig": {
      "command": "npx",
      "args": ["-y", "ownrig-mcp"]
    }
  }
}

Usage with Cursor

Add to .cursor/mcp.json in your project:

{
  "mcpServers": {
    "ownrig": {
      "command": "npx",
      "args": ["-y", "ownrig-mcp"]
    }
  }
}

Usage from source (development)

If you have the OwnRig repo cloned:

# From project root
npm install
npm run generate:rec-data
npm run mcp

The mcp script builds a self-contained bundle via esbuild (resolving all @/ path aliases) then runs it. Running tsx mcp-server/index.ts directly does not work because the engine uses TypeScript path aliases that tsx cannot resolve transitively across module boundaries.
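For intuition, the bundle step is roughly equivalent to the following sketch. This is a hypothetical recreation, not the package's actual script: the entry point, alias root, and flags are assumptions and may differ from the real mcp script.

```
# Hypothetical equivalent of "npm run mcp" (flags are assumptions)
npx esbuild mcp-server/index.ts \
  --bundle \
  --platform=node \
  --format=esm \
  --alias:@=./src \
  --outfile=mcp-server/dist/index.mjs

node mcp-server/dist/index.mjs
```

Bundling with esbuild rewrites the @/ aliases at build time, which is why the resulting file runs under plain node with no alias configuration.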

For your MCP client config, point to the built bundle:

{
  "mcpServers": {
    "ownrig": {
      "command": "node",
      "args": ["mcp-server/dist/index.mjs"],
      "cwd": "/path/to/ownrig"
    }
  }
}

Example queries

Once connected, ask your AI assistant:

  • "What GPU do I need to run Llama 3.1 70B locally?"
  • "Can an RTX 4090 run Qwen 3 32B?"
  • "Recommend a build for running AI coding tools with Cursor"
  • "What models can I run on my M4 Max MacBook Pro?"
  • "Compare the Mac Studio M4 Ultra vs a custom build for AI"

Data

This package includes a snapshot of OwnRig's verified hardware compatibility data. The data is updated with each package release.

  • 50 AI models with VRAM requirements per quantization level
  • 25 GPUs and Apple Silicon devices with specs and pricing
  • 14 curated PC builds with component lists and benchmarks
  • 9 ready-to-buy machines (Mac, Dell, ASUS, NVIDIA)
  • 663 model × device × quantization compatibility entries
  • 7 workflow profiles mapping AI tools to hardware needs
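
The compatibility data amounts to one row per model × device × quantization combination. A minimal TypeScript sketch of such a row follows; the field names are illustrative assumptions, not the real ownrig-mcp schema:

```typescript
// Illustrative shape of one compatibility entry; field names are
// assumptions, not the actual ownrig-mcp data format.
interface CompatibilityEntry {
  modelId: string;        // e.g. "llama-3.1-70b"
  deviceId: string;       // e.g. "rtx-4090"
  quantization: string;   // e.g. "Q4_K_M"
  fitsInVram: boolean;    // does the quantized model fit in device VRAM?
  tokensPerSec: number;   // throughput estimate (0 if it does not fit)
}

const example: CompatibilityEntry = {
  modelId: "llama-3.1-70b",
  deviceId: "rtx-4090",
  quantization: "Q4_K_M",
  fitsInVram: false,      // a 70B model at Q4 needs roughly 40 GB; an RTX 4090 has 24 GB
  tokensPerSec: 0,
};

console.log(`${example.modelId} on ${example.deviceId}: fits=${example.fitsInVram}`);
```

Tools like query_compatibility and find_models_for_device can be thought of as filters and lookups over a table of such rows.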

Source: ownrig.com | License: MIT
