
mcp_llm_inferencer

Created by Sumedh1599, 8 months ago
Uses the Claude or OpenAI API to convert prompt-mapped input into concrete MCP server components such as tools, resource templates, and prompt handlers.

Introduction

The mcp_llm_inferencer is an open-source library designed to leverage the power of Large Language Models (LLMs) such as Claude and OpenAI's GPT to convert prompt-mapped inputs into concrete components for MCP servers. These components include tools, resource templates, and prompt handlers, making it a versatile tool for developers working with MCP server environments.

Features

  • LLM Call Engine: Efficiently calls LLMs with built-in retry and fallback logic to ensure reliable responses.
  • Interchangeable Claude & OpenAI Support: Seamlessly switch between Claude and OpenAI APIs based on your preference or availability.
  • Streaming Support for Claude Desktop: Stream responses directly from Claude Desktop, providing real-time feedback.
  • Tool and Resource Response Validation: Ensures that the generated tools and resources meet predefined criteria before deployment.
  • Structured Output Bundling: Organizes output into structured bundles per component, simplifying integration and use.
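
The retry-and-fallback behavior described above can be sketched as follows. This is an illustrative pattern only, not the library's actual internals; `call_with_retry`, `claude_stub`, and `openai_stub` are hypothetical names:

```python
import time

def call_with_retry(providers, prompt, max_retries=2, delay=0.0):
    """Try each provider in order; retry transient failures, then fall back."""
    last_error = None
    for call in providers:
        for attempt in range(max_retries + 1):
            try:
                return call(prompt)
            except Exception as exc:
                last_error = exc
                time.sleep(delay)
    raise RuntimeError("all providers failed") from last_error

# Demo with stub providers: the first always fails, the second succeeds.
def claude_stub(prompt):
    raise ConnectionError("simulated outage")

def openai_stub(prompt):
    return {"provider": "openai", "text": f"response to: {prompt}"}

result = call_with_retry([claude_stub, openai_stub], "hello")
print(result["provider"])  # openai
```

The key design point is that retries are exhausted per provider before falling back, so a brief outage on one backend does not immediately switch the request to the other.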

Installation Instructions

Prerequisites

  • Python 3.6 or higher
  • An API key from Claude or OpenAI

Installing mcp_llm_inferencer

  1. Clone the repository:

    git clone https://github.com/your-repo/mcp_llm_inferencer.git
    cd mcp_llm_inferencer
    
  2. Install the package using pip:

    pip install .
    
  3. Set up your API keys as environment variables:

    • For Claude:
      export CLAUDE_API_KEY='your-claude-api-key'
      
    • For OpenAI:
      export OPENAI_API_KEY='your-openai-api-key'
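
The documentation states that a missing `api_key` argument falls back to these environment variables. A minimal sketch of that lookup (illustrative only; `resolve_api_key` is a hypothetical helper, not part of the library's API):

```python
import os

def resolve_api_key(api_type):
    """Return the API key for the chosen backend from the environment."""
    env_var = {"claude": "CLAUDE_API_KEY", "openai": "OPENAI_API_KEY"}[api_type]
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; export it before running")
    return key

# Demo with a placeholder value so the lookup succeeds.
os.environ.setdefault("OPENAI_API_KEY", "demo-key")
print(resolve_api_key("openai"))
```

Failing fast with a named environment variable makes a missing key obvious at startup rather than at the first LLM call.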
      

Usage Examples

Basic Example

Here is a simple example demonstrating how to use mcp_llm_inferencer with the OpenAI API:

from mcp_llm_inferencer import MCPInferencer

# Initialize the inferencer with OpenAI API
inferencer = MCPInferencer(api_type='openai')

# Define your prompt
prompt = "Generate a tool to extract emails from text."

# Generate components
components = inferencer.generate_components(prompt)

print(components)
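
The exact schema of the returned bundle is defined by the library's validators; the dictionary below is only a plausible illustration of the "structured output bundling" idea, with made-up field names:

```python
# Hypothetical component bundle; field names are illustrative, not the
# library's actual schema.
components = {
    "tools": [
        {
            "name": "extract_emails",
            "description": "Extract email addresses from free text",
            "input_schema": {
                "type": "object",
                "properties": {"text": {"type": "string"}},
                "required": ["text"],
            },
        }
    ],
    "resource_templates": [],
    "prompt_handlers": [],
}

# Each component type lives in its own section of the bundle.
for section, items in components.items():
    print(f"{section}: {len(items)} item(s)")
```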

Advanced Example with Claude

This example shows how to use the library with Claude API and handle streaming responses:

from mcp_llm_inferencer import MCPInferencer

# Initialize the inferencer with Claude API
inferencer = MCPInferencer(api_type='claude', stream=True)

# Define your prompt
prompt = "Create a resource template for an S3 bucket."

# Generate components with streaming support
for component in inferencer.generate_components(prompt):
    print(component)

API Documentation

Class: MCPInferencer

Initialization

MCPInferencer(api_type, api_key=None, stream=False)
  • api_type (str): The type of LLM API to use ('claude' or 'openai').
  • api_key (str, optional): The API key for the specified LLM. If not provided, it will attempt to read from environment variables.
  • stream (bool, optional): Enable streaming support for Claude Desktop.

Methods

  • generate_components(prompt)
    • Generates MCP server components based on the given prompt.
    • prompt (str): The input prompt to be sent to the LLM.
    • Returns: A single dictionary of generated components, or an iterator of dictionaries when stream=True.

Example Method Usage

components = inferencer.generate_components("Generate a tool for sentiment analysis.")
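
Because the return type depends on the `stream` flag, callers may want to normalize both shapes before processing. A small sketch (hypothetical helper, not part of the library's API):

```python
def iter_components(result):
    """Yield component dicts whether `result` is a single dict or a stream."""
    if isinstance(result, dict):
        yield result
    else:
        yield from result

# Works for a plain dictionary...
single = {"tools": ["a"]}
print(list(iter_components(single)))

# ...and for a generator of dictionaries (as with stream=True).
def fake_stream():
    yield {"tools": ["a"]}
    yield {"resource_templates": ["b"]}

print(len(list(iter_components(fake_stream()))))  # 2
```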

License

mcp_llm_inferencer is released under the MIT License. See the LICENSE file for more details.


Feel free to contribute to mcp_llm_inferencer by submitting issues or pull requests on our GitHub repository.

⚠️ Development Status

This library is currently in early development, and some tests may fail.

Contributions to fix these issues are welcome! Please submit a pull request if you have a solution.
