
Anthropic Model Context Protocol (MCP) Server with Ollama Integration

Created by jorgesandoval, 7 months ago
Model Context Protocol (MCP) server integrated with an external inference service (e.g., Ollama/Gemma3) via middleware.

A hybrid architecture combining an Anthropic-compatible Model Context Protocol (MCP) server with Ollama/Gemma LLMs for inference. This implementation follows the official Model Context Protocol specification while using open-source models.

Overview

This project implements an Anthropic-compatible MCP server following the official MCP specification, along with middleware that handles communication between clients and Ollama's Gemma model.

Architecture

┌───────────┐     ┌──────────────┐     ┌────────────────────────┐
│  Client   │────▶│   Middleware │────▶│  Anthropic MCP Server  │
│           │◀────│              │◀────│  (Claude MCP Protocol) │
└───────────┘     └──────┬───────┘     └────────────────────────┘
                         │                          │
                         │                          │
                         ▼                          ▼
                  ┌─────────────┐         ┌────────────────────┐
                  │    Ollama   │         │   SQLite Database  │
                  │ (Gemma3:4b) │         │  (Context Storage) │
                  └─────────────┘         └────────────────────┘

Components

  • MCP Server: Implements the Anthropic MCP protocol on port 3000
  • Middleware: Handles communication between clients and Ollama
  • Ollama: Runs the Gemma3:4b model for inference
  • Database: SQLite database for storing conversation contexts

Features

  • MCP Protocol Features:

    • Tools: For context management
    • Resources: Expose conversation history
    • Prompts: Standard prompt templates
  • Model Support:

    • Use with Claude models via Claude Desktop or other MCP clients
    • Compatible with any client supporting the Model Context Protocol

MCP Protocol Compliance

This implementation strictly adheres to the official Model Context Protocol (MCP) specification published by Anthropic:

  1. JSON-RPC 2.0 Protocol: Uses the standard JSON-RPC 2.0 format for all communication, ensuring compatibility with other MCP clients and servers.

  2. Protocol Initialization: Correctly implements the /initialize endpoint with proper protocol version negotiation (2025-03-26) and client capability declarations.

  3. Tool Interface: Fully implements the tool calling protocol with all required annotations:

    • readOnlyHint: Marks tools that do not modify state
    • destructiveHint: Flags potentially destructive operations
    • idempotentHint: Marks operations that can be safely retried
  4. Resource Management: Implements the resource protocol for exposing conversation history with proper URIs and MIME types.

  5. Prompt Templates: Provides standard prompt templates following the MCP specification for common operations.

  6. Error Handling: Implements proper error responses with standardized error codes and messages as specified in the protocol.

  7. Security: Follows the security recommendations in the MCP specification regarding authentication and authorization.

  8. Versioning: Properly handles protocol versioning to ensure forward compatibility.

Our implementation is designed to be fully compatible with any MCP client, including Claude Desktop, VS Code extensions, and other tools that adhere to the official specification.
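
As an illustration of points 1 and 2, the first call a client makes might look like the following JSON-RPC 2.0 payload. This is a sketch: the clientInfo values are placeholders, and the protocol version string should match whatever revision the server actually advertises.

```python
import json

# Hypothetical JSON-RPC 2.0 "initialize" request, as an MCP client might send
# it to the server on port 3000. clientInfo values are placeholders.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",  # negotiate against the server's version
        "capabilities": {},               # client declares its capabilities here
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

payload = json.dumps(initialize_request)
print(payload)
```

The server is expected to reply with its own protocolVersion and capability declarations, after which the client can call the tool, resource, and prompt endpoints listed below.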

API Endpoints

MCP Server (port 3000)

The MCP Server implements the official MCP protocol endpoints:

  • Tools: /tools/list, /tools/call
  • Resources: /resources/list, /resources/read
  • Prompts: /prompts/list, /prompts/get
  • Core: /initialize
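
As a sketch of what /tools/list might return, including the behavior annotations described under MCP Protocol Compliance: the store_context tool name, its schema, and the annotation values below are illustrative assumptions, not the server's actual tool list.

```python
# Hypothetical shape of a /tools/list response from this server. The tool
# shown (store_context) and its schema are invented for illustration.
tools_list_response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "tools": [
            {
                "name": "store_context",
                "description": "Persist a message in the SQLite context store",
                "inputSchema": {
                    "type": "object",
                    "properties": {
                        "session_id": {"type": "string"},
                        "content": {"type": "string"},
                    },
                    "required": ["session_id", "content"],
                },
                "annotations": {
                    "readOnlyHint": False,    # this tool writes to the database
                    "destructiveHint": False, # it never deletes existing context
                    "idempotentHint": True,   # re-storing the same message is safe
                },
            }
        ]
    },
}

tool = tools_list_response["result"]["tools"][0]
print(tool["name"])
```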

Middleware (port 8080)

  • POST /infer: Send user messages to Ollama/Gemma while storing context in the MCP server
  • GET /health: Check the health of the middleware and its connections

Requirements

  • Python 3.10+
  • Docker and Docker Compose
  • Ollama with Gemma3:4b model installed

Installation

Using setup script

chmod +x setup.sh
./setup.sh

The setup script will:

  1. Check for Docker installation
  2. Check for Ollama installation and pull the Gemma model if needed
  3. Build and start the Docker containers

Manual setup

# Clone the repository
git clone <repository-url>
cd simple-mcp-server

# Make sure Ollama is running with Gemma model
ollama pull gemma3:4b

# Build and run with Docker
docker-compose build
docker-compose up -d

Usage Example

Using the Middleware with Ollama/Gemma

You can interact with the system through the middleware, which will use Ollama/Gemma for inference while storing conversation context in the MCP server:

# Start a new conversation
curl -X POST http://localhost:8080/infer \
  -H "Content-Type: application/json" \
  -d '{"session_id": "user123", "content": "Hello, how are you today?"}'

# Continue the conversation with the same session ID
curl -X POST http://localhost:8080/infer \
  -H "Content-Type: application/json" \
  -d '{"session_id": "user123", "content": "What is the capital of France?"}'
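
The same calls can be scripted. A minimal Python sketch, assuming the middleware is reachable at localhost:8080 as above (building a request does not hit the network; calling infer() requires the stack to be running):

```python
import json
import urllib.request

MIDDLEWARE_URL = "http://localhost:8080/infer"  # middleware endpoint described above

def build_infer_request(session_id: str, content: str) -> urllib.request.Request:
    """Build the same POST /infer request the curl examples send."""
    body = json.dumps({"session_id": session_id, "content": content}).encode()
    return urllib.request.Request(
        MIDDLEWARE_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def infer(session_id: str, content: str) -> dict:
    """Send one conversation turn through the middleware (stack must be up)."""
    with urllib.request.urlopen(build_infer_request(session_id, content)) as resp:
        return json.loads(resp.read())

# Reusing the same session_id keeps the conversation context in the MCP server:
req = build_infer_request("user123", "Hello, how are you today?")
print(req.full_url)  # → http://localhost:8080/infer
```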

Using the MCP Server Directly

The MCP server on port 3000 can be used with any MCP client, such as:

  • Claude Desktop App
  • VS Code GitHub Copilot
  • Cursor
  • And many other tools listed on the MCP Clients page

Docker Configuration

The project uses Docker to containerize two main components:

  • MCP Server Container: Provides the Anthropic MCP protocol on port 3000
  • Middleware Container: Connects to Ollama for Gemma model inference on port 8080
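
A docker-compose.yml wiring the two containers together might look like the following sketch. The service names, build contexts, and environment variable names are assumptions; consult the compose file that ships with the repository.

```yaml
services:
  mcp-server:
    build: ./mcp-server            # assumed build context
    ports:
      - "3000:3000"                # Anthropic MCP protocol
    volumes:
      - ./data:/app/data           # SQLite context storage

  middleware:
    build: ./middleware            # assumed build context
    ports:
      - "8080:8080"                # /infer and /health
    environment:
      - MCP_SERVER_URL=http://mcp-server:3000
      - OLLAMA_URL=http://host.docker.internal:11434  # Ollama runs on the host
    depends_on:
      - mcp-server
```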

Testing

To test your installation:

# Run the included test script
python test_system.py

# Or use the shell script
./test.sh

License

This project is licensed under the MIT License - see the LICENSE file for details.
