Shinkai Apps

Created by dcSpark

GitHub stars Discord Twitter Follow

Shinkai is a two-click install AI manager (local and remote) that lets you spin up AI agents in minutes through a friendly UI. Agents and tools are all exposed via an MCP server.

A companion repository, Shinkai Node, provides the core services for agent management, job processing and secure communications.

Key Features

  • Rapid Agent Setup – create and configure agents in under five minutes with a guided UI.
  • Local or Remote – run everything on your machine or connect to a remote Shinkai Node.
  • MCP Server Integration – expose agents and tools over an MCP server for easy automation.

Demo

https://github.com/user-attachments/assets/bc5bb7da-7ca5-477d-838a-8239951b6c01

Documentation

General Documentation: https://docs.shinkai.com

Repository Structure

Apps

  • shinkai-desktop – cross-platform desktop UI (can also run in the browser).

Libs

  • shinkai-message-ts – message definitions and network helpers for talking to Shinkai Node.
  • shinkai-node-state – React Query based state management for node data.
  • shinkai-ui – reusable React components used across the apps.
  • shinkai-artifacts – styled UI primitives built on top of Radix and Tailwind.
  • shinkai-i18n – translation utilities powered by i18next.
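
For a rough sense of what a translation-utility layer provides, here is a dependency-free sketch of key lookup with an English fallback, mimicking i18next's default fallback behaviour. The data and function shape are hypothetical; shinkai-i18n's real interface wraps i18next and is not reproduced here.

```typescript
// Minimal translation lookup with an English fallback, in the spirit of
// i18next's fallbackLng behaviour. Hypothetical resources and API shape.
type Resources = Record<string, Record<string, string>>;

const resources: Resources = {
  en: { greeting: "Hello" },
  es: { greeting: "Hola" },
};

function t(lng: string, key: string): string {
  // Try the requested language, then English, then echo the key itself.
  return resources[lng]?.[key] ?? resources["en"]?.[key] ?? key;
}

console.log(t("es", "greeting")); // → Hola
```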

Getting started

To get started, first clone this repo:

$ git clone https://github.com/dcSpark/shinkai-apps

Next, download the side binaries for your platform:

macOS

ARCH="aarch64-apple-darwin" \
SHINKAI_NODE_VERSION="v1.0.10" \
OLLAMA_VERSION="v0.7.1" \
npx ts-node ./ci-scripts/download-side-binaries.ts

Linux

ARCH="x86_64-unknown-linux-gnu" \
OLLAMA_VERSION="v0.7.1" \
SHINKAI_NODE_VERSION="v1.0.10" \
npx ts-node ./ci-scripts/download-side-binaries.ts

Windows

$ENV:OLLAMA_VERSION="v0.7.1";
$ENV:SHINKAI_NODE_VERSION="v1.0.10";
$ENV:ARCH="x86_64-pc-windows-msvc";
npx ts-node ./ci-scripts/download-side-binaries.ts
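
The three platform commands above differ only in their environment variables; the ARCH value is a target triple determined by the host OS and CPU. A hypothetical helper sketching that mapping (the triples are copied from the commands above; the download script itself takes them from the environment):

```typescript
// Map an OS/CPU pair (Node.js process.platform / process.arch naming) to
// the ARCH target triple used by download-side-binaries.ts.
// A sketch for illustration; not part of the repository's scripts.
function targetTriple(plat: string, cpu: string): string {
  const table: Record<string, string> = {
    "darwin/arm64": "aarch64-apple-darwin",
    "linux/x64": "x86_64-unknown-linux-gnu",
    "win32/x64": "x86_64-pc-windows-msvc",
  };
  const triple = table[`${plat}/${cpu}`];
  if (triple === undefined) throw new Error(`unsupported platform: ${plat}/${cpu}`);
  return triple;
}

console.log(targetTriple("darwin", "arm64")); // → aarch64-apple-darwin
```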

Run one of the projects

Once that is done, use npm to compile and serve a project yourself:

cd shinkai-apps
nvm use
npm ci
npx nx serve {project-name} # e.g. npx nx serve shinkai-desktop

Project specific configurations

  • shinkai-desktop – for development and building purposes
    • Run as a Desktop App using Vite: Run npx nx serve:tauri shinkai-desktop and it will automatically launch the Shinkai Desktop application.
    • Run as a Web App: Run npx nx serve shinkai-desktop and open a browser and navigate to http://localhost:1420.

Useful Commands

Each command, where needed, builds the target project and its dependencies according to the dependency tree NX infers from the imports between projects.

  • Run a single task

    Command: npx nx [target] [project-name]

    Params:

    • target: build | serve | lint | test | e2e

E.g.:

    • npx nx build shinkai-desktop
    • npx nx lint shinkai-message-ts
    • npx nx test shinkai-ui
    • npx nx serve shinkai-desktop
  • Run many tasks

    Command: npx nx run-many --target=[target]

    Params:

    • target: build | serve | lint | test | e2e

E.g.:

    • npx nx run-many --target=build
    • npx nx run-many --target=lint
    • npx nx run-many --target=test
    • npx nx run-many --target=e2e
    • npx nx run-many --target=serve
  • Run on affected projects

    Command: npx nx affected --target=[target]

    Params:

    • target: build | serve | lint | test | e2e

E.g.:

    • npx nx affected --target=build

When you build a project, NX caches the result to speed up later builds; to skip the cache, add --skip-nx-cache to any of the commands above.
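
The commands above follow a consistent pattern of target and flags; a hypothetical helper that composes the command strings makes the shape explicit (only npx nx itself actually runs tasks):

```typescript
// Compose the nx invocations listed above as strings.
// A sketch for illustration; not part of the repository's tooling.
type Target = "build" | "serve" | "lint" | "test" | "e2e";

function nxAffected(target: Target, skipCache = false): string {
  return `npx nx affected --target=${target}${skipCache ? " --skip-nx-cache" : ""}`;
}

console.log(nxAffected("build", true)); // → npx nx affected --target=build --skip-nx-cache
```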

  • Create a dev build

    • NODE_OPTIONS="--max_old_space_size=8192" npx nx build shinkai-desktop --config="./src-tauri/tauri.conf.development.json"
  • Update ollama models repository

    • npx ts-node ./ci-scripts/generate-ollama-models-repository.ts

Dev conventions

Monorepo

To orchestrate the tasks, dependencies and hierarchy between its different projects, this repository uses NX as its monorepo tooling.

Third party dependencies

All projects share a common base of dependencies defined in the ./package.json file at the root of the repository. Nested package.json files are used only to override or extend these base attributes.

UI Libraries

To build the UI there are three core libraries:

  • radix for unstyled base components.
  • shadcn for ready-to-use components.
  • tailwindcss for CSS customizations, structure, layouts and helpers.

State management

To implement state management, two different libraries are used; one of them is React Query, which backs the node state in shinkai-node-state.
