OtterBridge

Created by Alex Pang

File Structure

├── .env.example           # Example environment variables
├── .gitignore             # Files to exclude from git
├── LICENSE                # Open source license (MIT, Apache, etc.)
├── README.md              # Project documentation
├── server.py              # MCP server implementation (previously fastmcp_server.py)
├── requirements.txt       # Python dependencies
└── src/                   # Source code directory
    ├── __init__.py        # Package initialization
    └── services/          # Services
        ├── __init__.py
        └── ollama.py      # Ollama service

OtterBridge

OtterBridge Logo

OtterBridge is a lightweight, flexible server for connecting applications to various Large Language Model providers. Following the principles of simplicity and composability outlined in Anthropic's guide to building effective agents, OtterBridge provides a clean interface to LLMs while maintaining adaptability for different use cases.

Currently supporting Ollama, with planned expansions to support other providers like ChatGPT and Claude.

Features

  • Provider-Agnostic: Designed to work with multiple LLM providers (currently Ollama, with ChatGPT and Claude coming soon)
  • Simple, Composable Design: Following best practices for LLM agent architecture
  • Lightweight Server: Built with FastMCP for reliable, efficient server implementation
  • Model Management: Easy access to model information and capabilities

Why "OtterBridge"?

Like otters who build connections between riverbanks, OtterBridge creates seamless pathways between your applications and various LLM providers. Just as otters are adaptable and resourceful, OtterBridge adapts to different LLM backends while providing consistent interfaces.

Prerequisites

Before installing OtterBridge, you need to have:

  1. Ollama installed and running on its default port (11434)
  2. uv installed for Python package management

Installation

  1. Clone this repository:
git clone https://github.com/yourusername/otterbridge.git
cd otterbridge
  2. Install dependencies using uv:
uv add -r requirements.txt
  3. Create a .env file based on the provided .env.example:
cp .env.example .env
  4. Configure your environment variables in the .env file.

Claude Desktop Integration

For Claude Desktop users, you'll need to add OtterBridge to your Claude Desktop configuration:

  1. Open your Claude Desktop config file
  2. Add the following configuration (adjust the path to match your local installation):
"otterbridge": {
    "command": "uv",
    "args": [
        "--directory",
        "C:\\Path\\To\\Your\\otterbridge",
        "run",
        "server.py"
    ]
}
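The snippet above is an entry for the mcpServers object in Claude Desktop's claude_desktop_config.json; a complete file might look like the sketch below (the Windows path is illustrative and should be replaced with your own checkout location):

```json
{
    "mcpServers": {
        "otterbridge": {
            "command": "uv",
            "args": [
                "--directory",
                "C:\\Path\\To\\Your\\otterbridge",
                "run",
                "server.py"
            ]
        }
    }
}
```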

Usage

Starting the Server

OtterBridge can be started in two ways:

  1. Manual start (for testing purposes):
uv run server.py
  2. Automatic start with MCP clients:
    • When using compatible MCP clients like Claude Desktop, OtterBridge will start automatically when needed

Available Tools

OtterBridge exposes the following tools via the Model Context Protocol (MCP):

  • chat: Send messages to LLMs and get AI-generated responses
  • list_models: Retrieve information about available language models

Tool Usage Examples

List Available Models

Example response:

{
    "status": "connected",
    "server_status": "online",
    "available_models": ["llama3", "llama3.1:8b", "codellama", "llama3.3", "qwen2.5"],
    "available_models_count": 5,
    "message": "Successfully retrieved available Ollama models"
}
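A response like the one above can be assembled from the payload that Ollama's GET /api/tags endpoint returns. The sketch below shows one way the service layer might do this; the function name and the assumption that the payload has a top-level "models" list of objects with "name" keys are illustrative, not taken from the OtterBridge source:

```python
def build_models_response(tags_payload: dict) -> dict:
    """Map an Ollama /api/tags payload into the list_models response shape.

    Assumes `tags_payload` looks like {"models": [{"name": "llama3"}, ...]},
    which is the shape returned by a local Ollama server.
    """
    names = [model["name"] for model in tags_payload.get("models", [])]
    return {
        "status": "connected",
        "server_status": "online",
        "available_models": names,
        "available_models_count": len(names),
        "message": "Successfully retrieved available Ollama models",
    }
```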

Chat Completion

Example response:

{
    "role": "assistant",
    "content": "I'm doing well, thank you for asking! I'm here and ready to help you with any questions or tasks you might have. How can I assist you today?",
    "model": "llama3:latest"
}
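For chat, Ollama's POST /api/chat endpoint (non-streaming) returns a JSON object containing the model name and a "message" object with "role" and "content" fields. A minimal, hypothetical sketch of flattening that reply into the response shape shown above; the helper name is illustrative:

```python
def build_chat_response(ollama_reply: dict) -> dict:
    """Flatten a non-streaming Ollama /api/chat reply into the chat tool's shape.

    Assumes `ollama_reply` looks like
    {"model": "llama3:latest", "message": {"role": "assistant", "content": "..."}}.
    """
    message = ollama_reply.get("message", {})
    return {
        "role": message.get("role", "assistant"),
        "content": message.get("content", ""),
        "model": ollama_reply.get("model", ""),
    }
```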

Configuration

OtterBridge can be configured using environment variables:

Variable           Description                Default
OLLAMA_BASE_URL    URL of the Ollama server   http://localhost:11434
DEFAULT_MODEL      Default model to use       llama3.3
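Inside the server these variables can be read with os.getenv, falling back to the defaults above when they are unset. A minimal sketch (variable names match the table; how the actual source reads them is an assumption):

```python
import os

# Fall back to the documented defaults when the .env values are absent.
OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
DEFAULT_MODEL = os.getenv("DEFAULT_MODEL", "llama3.3")
```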

Roadmap

  • Q2 2025: Support for ChatGPT API integration
  • Q3 2025: Support for Claude API integration

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Development Guidelines

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License.

