Created by bhuvanmdev
SentinelCore is an advanced AI agent powered by the Model Context Protocol (MCP). It can browse the web, interact with local file systems, and is designed to keep evolving with new features. Whether you're looking for a smart assistant, a system manager, or a knowledge guide, SentinelCore adapts to your needs seamlessly.
Content

This section was written entirely by the SentinelCore agent and its tools (prompt → get the details from the internet → write them to a file), using Gemini 2.0 Flash.

USER PROMPT: Now first search for a GitHub account named bhuvanmdev and scrape his front page and search for a repo that has something to do with an MCP application. Then go to that repository and scrape the client.py and server.py file contents, create a neat summary of it, and write it to readme.md in the current dir.

Sentinel Core Agent Summary

This document provides a summary of the client.py and server.py files from this repo.

client.py

The client.py file manages the interaction between a user, an LLM (large language model), and various tools. It initializes and orchestrates a chat session in which the LLM can use tools to answer user queries. The client handles server connections, tool execution with retries, and communication with the LLM provider.

Key Components:

  • Configuration: Loads environment variables (including the LLM API key) and server configurations from a JSON file.
  • Server: Manages connections to MCP (Model Context Protocol) servers, lists available tools, executes tools with a retry mechanism (see the retry sketch after this list), and cleans up resources.
  • Tool: Represents a tool with its properties (name, description, input schema) and provides a method to format the tool information for the LLM.
  • LLMClient: Manages communication with the LLM, using either Azure OpenAI or Google Gemini models.
  • ChatSession: Orchestrates the interaction between the user, LLM, and tools. It processes LLM responses, executes tools if needed, and handles the main chat session loop.
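
A minimal sketch of the retry behaviour described for the Server component; the method and parameter names (execute_tool, retries, delay) are assumptions rather than the exact names used in client.py:

```python
import asyncio
import logging

class Server:
    """Holds a connected MCP client session (connection setup omitted in this sketch)."""

    def __init__(self, session):
        self.session = session  # an MCP ClientSession connected over stdio

    async def execute_tool(self, tool_name: str, arguments: dict,
                           retries: int = 2, delay: float = 1.0):
        """Call a tool on the MCP server, retrying on transient failures."""
        attempt = 0
        while True:
            try:
                return await self.session.call_tool(tool_name, arguments)
            except Exception as exc:
                attempt += 1
                if attempt > retries:
                    logging.error("Tool %s failed after %d attempts: %s",
                                  tool_name, attempt, exc)
                    raise
                logging.warning("Retrying %s (attempt %d/%d): %s",
                                tool_name, attempt, retries, exc)
                await asyncio.sleep(delay)
```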

The client sets up a system message for the LLM, providing it with available tools and instructions on how to use them. It then enters a loop where it takes user input, sends it to the LLM, and processes the LLM's response. If the response contains a tool call, the client executes the tool and sends the result back to the LLM. This process continues until the user exits the chat session.
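
A rough sketch of that loop, assuming hypothetical method names (get_response, execute_tool, format_for_llm) and a JSON-formatted tool-call convention; the actual client.py may differ:

```python
import json

async def run_chat(llm_client, server, tools) -> None:
    # System message advertising the available tools to the LLM.
    system = "You can call these tools:\n" + "\n".join(
        tool.format_for_llm() for tool in tools)
    messages = [{"role": "system", "content": system}]

    while True:
        user = input("You: ").strip()
        if user.lower() in {"exit", "quit"}:
            break
        messages.append({"role": "user", "content": user})
        reply = llm_client.get_response(messages)

        try:
            # By convention the LLM answers a tool call as JSON, e.g.
            # {"tool": "read_file", "arguments": {"path": "readme.md"}}.
            call = json.loads(reply)
            result = await server.execute_tool(call["tool"], call.get("arguments", {}))
            messages.append({"role": "assistant", "content": reply})
            messages.append({"role": "user", "content": f"Tool result: {result}"})
            reply = llm_client.get_response(messages)  # let the LLM use the result
        except (json.JSONDecodeError, KeyError, TypeError):
            pass  # plain-text answer, no tool call

        messages.append({"role": "assistant", "content": reply})
        print("Assistant:", reply)
```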

server.py

The server.py file implements an MCP server with various tools, including file system operations, web scraping, and AI-powered search. It uses the fastmcp library to create and run the server.

Key Tools:

  • is_file_folder_present: Checks if a file or folder exists in the file system (see the declaration sketch after this list).
  • cur_datetimetime: Returns the current date and time.
  • browser_ai_search: Searches the web using an AI agent (Brave Search) and returns the response.
  • read_file: Reads the content of a file.
  • write_file: Writes content to a file (both text and binary).
  • web_page_scrapper: Scrapes a webpage and returns the content in markdown format. It can also index the content for vector-based search.
  • get_all_vector_indexes: Retrieves all vector embedding indexes in the current directory.
  • search_via_index: Searches a query via a vector embedding index.
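
As a rough illustration of how such tools are typically declared with fastmcp (the names below mirror the list, but the exact signatures in server.py are assumptions):

```python
from pathlib import Path
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("sentinel-core")  # server name is illustrative

@mcp.tool()
def is_file_folder_present(path: str) -> bool:
    """Check whether a file or folder exists on the local file system."""
    return Path(path).exists()

@mcp.tool()
def read_file(path: str) -> str:
    """Return the text content of a file."""
    return Path(path).read_text(encoding="utf-8")
```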

The server also initializes an asynchronous web crawler (crawl4ai) and sets up an embedding model for vectorizing content. The server's lifespan is managed using an asynchronous context manager to ensure proper startup and shutdown of the crawler.

The server starts the MCP server loop using mcp.run(transport="stdio"), allowing it to receive and process tool calls from the client.
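
A minimal sketch of that startup/shutdown pattern, assuming the fastmcp lifespan hook and crawl4ai's AsyncWebCrawler context manager; the embedding-model setup is only indicated by a comment:

```python
from contextlib import asynccontextmanager
from crawl4ai import AsyncWebCrawler
from mcp.server.fastmcp import FastMCP

@asynccontextmanager
async def lifespan(server):
    # Started once when the MCP server boots; closed cleanly on shutdown.
    async with AsyncWebCrawler() as crawler:
        # (The real server.py also loads an embedding model here.)
        yield {"crawler": crawler}

mcp = FastMCP("sentinel-core", lifespan=lifespan)

if __name__ == "__main__":
    # stdio transport lets the client launch the server as a subprocess
    # and exchange tool calls over stdin/stdout.
    mcp.run(transport="stdio")
```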
