
AskTheApi Team Builder


A high-level Python library for building and managing networks of autonomous agents that collaborate to solve complex tasks. It's designed to work seamlessly with APIs defined using the OpenAPI standard. The library provides a clean, type-safe interface for creating, configuring, and running teams of agents, making it easy to orchestrate multi-agent workflows with minimal boilerplate.

Features

  • 🚀 Effortless Agent Network Creation: Quickly build agent networks with custom tools and capabilities based on OpenAPI specifications.

  • 🤝 Team-Based Collaboration: Easily define agent teams with automatic coordination handled by a built-in planning agent.

  • 📡 Streaming Interactions: Stream agent communication in real-time for more dynamic and responsive workflows.

  • 🔧 Built-in HTTP Client: Simplify tool implementation with an integrated HTTP client ready to call external APIs.

  • ✨ Type Safety with Pydantic: Leverage Pydantic models for robust data validation and clear type definitions.

  • 🎯 Clean and Intuitive API: Designed for developers—minimal boilerplate, maximum clarity.

Installation

pip install asktheapi-team-builder
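
To verify the installation, the package should import cleanly (the module name matches the imports used in the Quick Start below):

# Quick sanity check of the installed package
python -c "import asktheapi_team_builder"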

Quick Start

Here's how to use the package:

1. Create agents from OpenAPI spec

from typing import List

from asktheapi_team_builder import TeamBuilder, Agent, Tool, Message, APISpecHandler
# Note: ChatMessage, used in the streaming example below, must also be imported.
# Depending on your version it may be re-exported by asktheapi_team_builder or
# come from the underlying AutoGen message types.

async def create_agents_from_spec():
    # Initialize handlers
    api_spec_handler = APISpecHandler()
    
    # Download and parse OpenAPI spec
    spec_content = await api_spec_handler.download_url_spec("https://api.example.com/openapi.json")
    
    # Classify endpoints into logical groups
    classification_result = await api_spec_handler.classify_spec(
        spec_content
    )
    
    # Generate agents for each group
    agents = []
    for group_spec in classification_result.specs:
        agent_result = await api_spec_handler.generate_agent_for_group(
            group_spec,
            spec_content
        )
        agents.append(agent_result)
    
    return agents

2. Build and run a team

async def run_agent_team(agents: List[Agent], query: str):
    # Initialize team builder
    team_builder = TeamBuilder(
        model="gpt-4",
        model_config={"temperature": 0.7}
    )
    
    # Build the team
    team = await team_builder.build_team(agents)
    
    # Create messages
    messages = [
        Message(
            role="user",
            content=query
        )
    ]
    
    # Run the team with streaming
    async for event in team_builder.run_team(team, messages, stream=True):
        if isinstance(event, ChatMessage):
            print(f"{event.source}: {event.content}")

Example usage

async def main():
    # Create agents from spec
    api_agents = await create_agents_from_spec()
    
    # Combine with manually defined agents
    # (weather_agent is assumed to be an Agent instance created elsewhere)
    all_agents = [weather_agent] + api_agents
    
    # Run the team
    await run_agent_team(
        all_agents,
        "What's the weather like in London and how might it affect local businesses?"
    )
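
Since main is a coroutine, run it from an asyncio event loop, for example:

import asyncio

if __name__ == "__main__":
    asyncio.run(main())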

Custom Headers and Configuration

You can configure the team builder with custom headers and model settings:

team_builder = TeamBuilder(
    model="gpt-4",
    model_config={
        "temperature": 0.7,
        "default_headers": {
            "Authorization": "Bearer your-token",
            "Custom-Header": "custom-value"
        }
    }
)

# Run team with extra headers for specific requests
team = await team_builder.build_team(agents)
result = await team_builder.run_team(
    team,
    messages,
    extra_headers={"Request-ID": "123"}
)

MCP (Model Context Protocol) Support

The library includes built-in support for Model Context Protocol, allowing you to expose your agent teams as API endpoints with automatic tool generation from OpenAPI specifications.

from asktheapi_team_builder import MCPService, MCPConfig

# Configure MCP service
mcp_config = MCPConfig(
    transport="sse",  # Server-Sent Events transport
    port=8000,        # Port to run the MCP server
    name="asktheapi_mcp"  # Service name
)

# Initialize MCP service
mcp_service = MCPService(mcp_config)

# Start MCP server with OpenAPI spec
await mcp_service.start_from_spec(
    url_spec="https://api.example.com/openapi.json",
    headers={"Authorization": "Bearer your-token"}
)

The MCP service will:

  • Automatically download and parse the OpenAPI specification
  • Classify endpoints into logical groups
  • Generate appropriate tools for each group
  • Expose these tools through a Model Context Protocol (MCP) interface
  • Handle real-time streaming of agent interactions
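
Once the server is running, any MCP-compatible client can connect to it. Below is a minimal sketch using the official mcp Python SDK; the /sse path and port 8000 are assumptions based on the configuration above, not values documented by this library:

from mcp import ClientSession
from mcp.client.sse import sse_client

async def list_mcp_tools():
    # Connect to the SSE endpoint exposed by the MCP service started above
    # (the /sse path and port 8000 are assumptions matching the MCPConfig example).
    async with sse_client("http://localhost:8000/sse") as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            # List the tools generated from the OpenAPI spec
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)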

Contributing

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Development Setup

# Clone the repository
git clone https://github.com/alexalbala/asktheapi-team-builder.git
cd asktheapi-team-builder

# Install dependencies
pip install -e ".[dev]"

# Run tests
pytest

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • Built on top of Microsoft's AutoGen
  • Inspired by the need for a higher-level interface for agent team management