
Poe o3 MCP Server

Created by Anansitrading, 8 months ago
A lightweight MCP server implementation for accessing OpenAI's o3 model via Poe API

A lightweight Model Context Protocol (MCP) server implementation that provides access to OpenAI's o3 model and other models via Poe's API. This server allows you to integrate Poe's AI capabilities into any MCP-compatible application.

Features

  • Simple MCP server implementation using FastMCP
  • Direct integration with Poe's API to access the o3 model and other models
  • Model selection via command-line style flags in prompts
  • Asynchronous request handling for efficient processing
  • Comprehensive error handling and logging
  • Easy setup and configuration

Prerequisites

  • Python 3 (the setup below uses venv and pip)
  • A Poe API key

Installation

  1. Clone this repository:

    git clone https://github.com/Anansitrading/po3_MCP.git
    cd po3_MCP
    
  2. Create a virtual environment (recommended):

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
    
  3. Install the required dependencies:

    pip install -r requirements.txt
    
  4. Set up your environment variables:

    cp sample.env .env
    
  5. Edit the .env file and add your Poe API key:

    POE_API_KEY=your_poe_api_key_here
    

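Steps 4–5 above can be sketched in code. The helper below is hypothetical (it is not part of the repository, which may instead load `.env` via python-dotenv), but it illustrates the fail-fast check a server like this typically performs at startup:

```python
import os

def load_poe_key(env=os.environ):
    """Return the Poe API key from the environment, or raise a clear error.

    Illustrative only; the actual server's startup code may differ.
    """
    key = env.get("POE_API_KEY", "").strip()
    if not key:
        raise RuntimeError(
            "POE_API_KEY is not set. Copy sample.env to .env and add your key."
        )
    return key
```

Failing early with an explicit message is preferable to letting the first Poe API call fail with an opaque authentication error.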
Usage

Running the MCP Server

Run the server with:

python poe_o3_mcp_server.py

The server will start and listen for MCP protocol messages on standard input/output.
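MCP traffic over standard input/output is JSON-RPC 2.0. As a rough illustration (the exact envelope is defined by the MCP specification; the method and parameter names below follow its `tools/call` convention), a call to the server's ping tool might be serialized like this:

```python
import json

# A JSON-RPC 2.0 request for MCP's tools/call method, as it would travel
# over the server's standard input.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "ping", "arguments": {}},
}
wire = json.dumps(request)
```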

Model Selection via Flags

You can select different models available on Poe by adding a flag to your prompt:

--Claude-3.5-Sonnet Tell me about quantum computing

This will route your query to the Claude-3.5-Sonnet model instead of the default o3 model.

The flag can be placed anywhere in the message:

  • At the beginning: --GPT-4 What is the capital of France?
  • In the middle: Tell me --Claude-3-Opus about the history of Rome
  • At the end: What are the three laws of robotics? --Claude-3.5-Sonnet

The flag will be automatically removed from the message before it's sent to the model.

If no flag is specified, the server defaults to using the "o3" model.
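The flag handling described above could be implemented along these lines (a hypothetical sketch; the repository's actual parsing may differ): a regex pulls the first `--Model-Name` token out of the prompt, the token is stripped, and the model falls back to "o3" when no flag is present.

```python
import re

# Matches tokens like --GPT-4 or --Claude-3.5-Sonnet anywhere in the text.
FLAG_RE = re.compile(r"--([A-Za-z0-9][\w.\-]*)")

def parse_model_flag(message, default="o3"):
    """Extract a --Model flag from anywhere in the message.

    Returns (model, cleaned_message); the flag is removed before the
    prompt would be forwarded to Poe.
    """
    match = FLAG_RE.search(message)
    if not match:
        return default, message
    model = match.group(1)
    cleaned = (message[:match.start()] + message[match.end():]).strip()
    cleaned = re.sub(r"\s{2,}", " ", cleaned)  # collapse doubled spaces
    return model, cleaned
```

For example, `parse_model_flag("--GPT-4 What is the capital of France?")` yields `("GPT-4", "What is the capital of France?")`.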

Integrating with MCP Clients

This server provides two tools:

  1. o3_query - Send a query to the o3 model (or another model via flags) and get a response
  2. ping - A simple test tool that returns "pong"

Example of using the server with an MCP client:

from mcp.client import MCPClient

# Connect to the MCP server
client = MCPClient(server_command=["python", "path/to/poe_o3_mcp_server.py"])

# Call the o3_query tool with the default o3 model
response = client.call_tool("o3_query", {"message": "Tell me about quantum computing"})
print(response)

# Call the o3_query tool with a different model using a flag
response = client.call_tool("o3_query", {"message": "--Claude-3.5-Sonnet Tell me about quantum computing"})
print(response)

# Test the connection with ping
ping_response = client.call_tool("ping", {})
print(ping_response)  # Should print "pong"

You can also run the included example script:

python example.py

Configuration

The server uses the following environment variables:

  • POE_API_KEY: Your Poe API key (required)
  • LOG_LEVEL: Logging level (optional, defaults to DEBUG)
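The LOG_LEVEL handling can be sketched as follows (illustrative; the server's own setup may differ), mapping the variable to a numeric logging level and falling back to the documented default, DEBUG, for unknown names:

```python
import logging
import os

def resolve_log_level(env=os.environ, default="DEBUG"):
    """Map LOG_LEVEL (e.g. "INFO", "warning") to a numeric logging level.

    Unknown level names fall back to DEBUG, the documented default.
    """
    name = env.get("LOG_LEVEL", default).upper()
    level = getattr(logging, name, None)
    return level if isinstance(level, int) else logging.DEBUG
```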

Troubleshooting

If you encounter issues:

  1. Check that your Poe API key is valid and correctly set in the .env file
  2. Ensure you have the correct dependencies installed
  3. Check the server logs for detailed error messages
  4. Verify that you have an active internet connection
  5. If using a model flag, make sure the model name is correct and available on Poe

License

MIT
