Feyod MCP Server

FastMCP-based Model Context Protocol (MCP) server for querying Feyenoord football match data using natural language. Compatible with Claude Desktop and other MCP clients.

The server is publicly available at https://mcp.feyod.nl/mcp and a Docker container is available on Docker Hub (jeroenvdmeer/feyod-mcp).


Overview

This MCP server provides a natural language interface to query Feyod: Feyenoord Open Data. The underlying database is maintained in the feyod GitHub repository. You will need to obtain the latest SQL file from that repository to set up the required database.

The server uses LangChain to:

  1. Convert natural language questions into SQL queries (optionally leveraging few-shot examples for better accuracy).
  2. Validate the generated SQL.
  3. Attempt to fix invalid SQL using an LLM.
  4. Execute the valid SQL against a SQLite database.
  5. Return the raw query results.

LLM and embedding models are loaded dynamically from configuration via a provider factory (llm_factory.py), making it easy to switch between providers such as OpenAI and Google.
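
To make the flow concrete, here is a minimal sketch of that question-to-result pipeline in Python. It is an illustration under assumptions, not the server's actual code: the prompt text, the ChatOpenAI model choice, and the validate-by-EXPLAIN shortcut are simplified stand-ins for what the server and llm_factory.py do.

    import sqlite3

    from langchain_core.prompts import ChatPromptTemplate
    from langchain_openai import ChatOpenAI  # or the package for your provider

    llm = ChatOpenAI(model="gpt-4o-mini")  # illustrative model choice

    prompt = ChatPromptTemplate.from_messages([
        ("system", "Translate the user's question into a single SQLite query "
                   "against the Feyod schema. Return only the SQL."),
        ("human", "{question}"),
    ])

    def answer(question: str, db_path: str = "../feyod/feyod.db"):
        sql = (prompt | llm).invoke({"question": question}).content.strip()
        connection = sqlite3.connect(db_path)
        try:
            # EXPLAIN parses the statement without running it; on failure the
            # real server asks the LLM to repair the SQL and retries.
            connection.execute(f"EXPLAIN {sql}")
            return sql, connection.execute(sql).fetchall()
        finally:
            connection.close()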


Consumption

Using the Public Endpoint

The Feyod MCP server is publicly available at https://mcp.feyod.nl/mcp. You can connect to this endpoint from any MCP-compatible client, such as Claude Desktop.
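
If you are scripting against the endpoint rather than using a desktop client, the official MCP Python SDK can connect to it. A minimal sketch, assuming the /mcp endpoint speaks the streamable HTTP transport:

    import asyncio

    from mcp import ClientSession
    from mcp.client.streamable_http import streamablehttp_client

    async def main() -> None:
        async with streamablehttp_client("https://mcp.feyod.nl/mcp") as (read, write, _):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

    asyncio.run(main())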

Using the Docker Container

A Docker image of the Feyod MCP server is available on Docker Hub. You can pull and run it using the following commands:

  1. Pull the Docker image:

    docker pull jeroenvdmeer/feyod-mcp
    
  2. Run the Docker container: You will need to provide the necessary environment variables for the LLM provider and API key. You can also mount the feyod.db file if you want to use a local database instead of the one included in the image.

    docker run -p 8000:8000 \
      -e LLM_PROVIDER="your_llm_provider" \
      -e LLM_API_KEY="your_api_key" \
      jeroenvdmeer/feyod-mcp
    

    Replace your_llm_provider and your_api_key with your actual LLM configuration.

    To mount a local database file:

    docker run -p 8000:8000 \
      -e LLM_PROVIDER="your_llm_provider" \
      -e LLM_API_KEY="your_api_key" \
      -v <absolute_path_to_feyod_db>:/app/feyod/feyod.db \
      jeroenvdmeer/feyod-mcp
    

    Replace <absolute_path_to_feyod_db> with the absolute path to your feyod.db file on your host machine.


Tools

This server exposes MCP tools for querying the Feyenoord database. Tools are discoverable via the MCP protocol (tools/list).

  • query_feyod_database: Converts a natural language query about Feyenoord matches into a SQL query, executes it, and returns the SQL query and its result.

See the MCP Inspector or Claude Desktop's tool list for details.
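
Continuing the client sketch from the Consumption section, invoking the tool could look like the snippet below. The argument key "question" is an assumption about the tool's input schema; check the schema returned by tools/list for the actual name.

    import asyncio

    from mcp import ClientSession
    from mcp.client.streamable_http import streamablehttp_client

    async def ask(question: str) -> None:
        async with streamablehttp_client("https://mcp.feyod.nl/mcp") as (read, write, _):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # "question" is an assumed argument name, not confirmed here
                result = await session.call_tool(
                    "query_feyod_database", {"question": question}
                )
                print(result.content)

    asyncio.run(ask("When did Feyenoord last beat Ajax at home?"))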


Setup

  1. Clone repositories:

    # Clone this repo for the MCP server
    git clone https://github.com/jeroenvdmeer/feyod-mcp.git
    
    # Clone the Feyod database
    git clone https://github.com/jeroenvdmeer/feyod.git
    
    # Change directory into the MCP server
    cd feyod-mcp
    
  2. Create and activate a virtual environment (recommended: uv):

    Refer to https://docs.astral.sh/uv/ for instructions on installing uv.

    uv venv
    .venv\Scripts\activate  # Windows
    # or
    source .venv/bin/activate  # macOS/Linux
    
  3. Install dependencies:

    # Using uv (recommended)
    uv add "mcp[cli]" langchain langchain-openai langchain-google-genai python-dotenv aiosqlite
    # Or using pip
    pip install -r requirements.txt
    
  4. Set up the database:

    # Change directory to the feyod directory with the SQL file
    cd ../feyod
    
    # Build the SQLite database using the SQL statements
    sqlite3 feyod.db < feyod.sql
    
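A quick way to confirm the database built correctly is to list its tables with Python's standard library; this makes no assumptions about the Feyod schema:

    import sqlite3

    connection = sqlite3.connect("feyod.db")
    tables = connection.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    print(tables)  # an empty list means the import did not work
    connection.close()
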

Configuration

Create a .env file in the feyod-mcp directory with the following variables:

# Path to the SQLite database file (relative to the feyod-mcp folder or absolute)
DATABASE_PATH="../feyod/feyod.db"

# Logging level (e.g., DEBUG, INFO, WARNING, ERROR)
LOG_LEVEL=INFO

# --- LLM Configuration ---
LLM_PROVIDER="google"  # or "openai", etc.
LLM_API_KEY="YOUR_API_KEY_HERE"
LLM_MODEL="gemini-2.5-flash-preview-05-20"

# --- Example Loading Configuration (Optional) ---
EXAMPLE_SOURCE="local"  # or "mongodb"
EXAMPLE_DB_CONNECTION_STRING=""
EXAMPLE_DB_NAME="feyenoord_data"
EXAMPLE_DB_COLLECTION="examples"

Notes:

  • Replace the placeholder API key with your actual key.
  • Ensure the LLM_PROVIDER matches one defined in llm_factory.py.
  • Install the necessary LangChain integration package for your chosen provider (e.g., langchain-google-genai).
  • If using EXAMPLE_SOURCE="mongodb", configure MongoDB settings as above.
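
For reference, a minimal sketch of how these variables might be read with python-dotenv (which is among the dependencies); the repository's actual loading code may differ:

    import os

    from dotenv import load_dotenv

    load_dotenv()  # reads .env from the current working directory

    DATABASE_PATH = os.getenv("DATABASE_PATH", "../feyod/feyod.db")
    LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO")
    LLM_PROVIDER = os.getenv("LLM_PROVIDER", "google")
    LLM_API_KEY = os.environ["LLM_API_KEY"]  # fail fast if the key is missing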

Running the Server

You can run the server in several ways:

  • Development mode (with hot reload and Inspector support):
    mcp dev main.py
    
  • Standard execution:
    python main.py
    # or
    mcp run main.py
    

The server will start and listen for MCP connections (stdio by default, or HTTP/SSE if configured).
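
For orientation, a FastMCP entry point typically follows the pattern sketched below. This illustrates the general FastMCP pattern, not the repository's actual main.py; the server name and the tool body are placeholders.

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("feyod")  # placeholder server name

    @mcp.tool()
    def query_feyod_database(question: str) -> str:
        """Answer a natural-language question about Feyenoord matches."""
        # NL -> SQL -> execute, as described in the Overview (elided here)
        return f"SQL and results for: {question}"

    if __name__ == "__main__":
        mcp.run()  # stdio by default; e.g. transport="streamable-http" for HTTP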


Running with Docker

You can containerize the MCP server using the provided Dockerfile.

  1. Build the Docker image: Navigate to the feyod-mcp directory in your terminal and run the following command:

    docker build -t feyod-mcp:latest .
    

    This will build an image tagged feyod-mcp:latest.

  2. Run the Docker container: You can run the container, mapping the internal port 8000 to an external port (e.g., 8000) on your host machine. You will also need to mount the database file as a volume so the container can access it.

    docker run -p 8000:8000 -v <absolute_path_to_feyod_db>:/app/../feyod/feyod.db feyod-mcp:latest
    

    Replace <absolute_path_to_feyod_db> with the absolute path to your feyod.db file on your host machine.

    Alternatively, you can pass environment variables directly:

    docker run -p 8000:8000 \
      -e DATABASE_PATH="/app/../feyod/feyod.db" \
      -e LLM_PROVIDER="google" \
      -e LLM_API_KEY="YOUR_API_KEY_HERE" \
      -e LLM_MODEL="gemini-2.5-flash-preview-05-20" \
      -v <absolute_path_to_feyod_db>:/app/../feyod/feyod.db \
      feyod-mcp:latest
    

    Remember to replace the placeholder values with your actual configuration.

The server inside the container will start and listen on 0.0.0.0:8000.


Adding New LLM Providers

To add support for a new provider:

  1. Install Package: Install the required LangChain integration package (e.g., pip install langchain-anthropic).
  2. Update Factory: Edit llm_factory.py to add the provider (see the sketch after this list).
  3. Update .env / README: Add the necessary API key to your .env file.
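
As a hedged sketch of step 2, the provider branch added to llm_factory.py might look like the following; the factory's real structure and function names may differ, and the Anthropic branch assumes langchain-anthropic is installed:

    from langchain_core.language_models.chat_models import BaseChatModel

    def create_llm(provider: str, model: str, api_key: str) -> BaseChatModel:
        """Hypothetical factory function; name and signature are assumptions."""
        if provider == "openai":
            from langchain_openai import ChatOpenAI
            return ChatOpenAI(model=model, api_key=api_key)
        if provider == "anthropic":  # the newly added provider
            from langchain_anthropic import ChatAnthropic
            return ChatAnthropic(model=model, api_key=api_key)
        raise ValueError(f"Unsupported LLM_PROVIDER: {provider}")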

Dependencies

  • Python 3.10+
  • MCP Python SDK (mcp)
  • See requirements.txt for specific package dependencies.
  • Provider-specific packages (e.g., langchain-openai, langchain-google-genai).

Debugging and Troubleshooting

  • Use mcp dev main.py and the MCP Inspector for local testing.
  • Logs are written to stderr and can be viewed in Claude Desktop logs or your terminal.
  • For environment/config issues, check .env and Claude Desktop config.
  • See MCP Debugging Guide for more tips.

Disclaimer

This initiative is not affiliated with Feyenoord Rotterdam N.V. and therefore not an official Feyenoord product. The data provided through this server is unofficial and might be incorrect.
