Kaggle MCP (Model Context Protocol) Server

This repository contains an MCP (Model Context Protocol) server (server.py) built using the fastmcp library. It interacts with the Kaggle API to provide tools for searching and downloading datasets, and a prompt for generating EDA notebooks.

Project Structure

  • server.py: The FastMCP server application. It defines resources, tools, and prompts for interacting with Kaggle.
  • .env.example: An example file for environment variables (Kaggle API credentials). Rename to .env and fill in your details.
  • requirements.txt: Lists the necessary Python packages.
  • pyproject.toml & uv.lock: Project metadata and locked dependencies for uv package manager.
  • datasets/: Default directory where downloaded Kaggle datasets will be stored.
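
For orientation, server.py follows fastmcp's decorator-based pattern: a FastMCP instance is created and plain Python functions are exposed as tools and prompts. A minimal, illustrative sketch (the function below is a placeholder, not one of the actual Kaggle tools):

# sketch.py -- minimal fastmcp server pattern (illustrative; not the actual server.py)
from fastmcp import FastMCP

mcp = FastMCP("kaggle-mcp")

@mcp.tool()
def echo(text: str) -> str:
    """Placeholder tool; the real server registers the Kaggle search/download tools."""
    return text

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default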

Setup

  1. Clone the repository:

    git clone <repository-url>
    cd <repository-directory>
    
  2. Create a virtual environment (recommended):

    python -m venv venv
    source venv/bin/activate  # On Windows use `venv\Scripts\activate`
    # Or use uv: uv venv
    
  3. Install dependencies: Using pip:

    pip install -r requirements.txt
    

    Or using uv:

    uv sync
    
  4. Set up Kaggle API credentials:

    • Method 1 (Recommended): Environment Variables
      • Create a .env file in the project root (for example, by copying .env.example)
      • Open the .env file and add your Kaggle username and API key:
        KAGGLE_USERNAME=your_kaggle_username
        KAGGLE_KEY=your_kaggle_api_key
        
      • You can obtain your API key from your Kaggle account page (Account > API > Create New API Token). This will download a kaggle.json file containing your username and key.
    • Method 2: kaggle.json file
      • Download your kaggle.json file from your Kaggle account.
      • Place the kaggle.json file in the expected location (usually ~/.kaggle/kaggle.json on Linux/macOS or C:\Users\<Your User Name>\.kaggle\kaggle.json on Windows). The kaggle library will automatically detect this file if the environment variables are not set.
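
Optionally, you can verify that the credentials are visible to the kaggle package before starting the server. A small sanity-check sketch (assumes python-dotenv is installed; the server itself is not involved here):

# check_kaggle_auth.py -- optional credential check, not part of the server
from dotenv import load_dotenv

load_dotenv()  # load KAGGLE_USERNAME / KAGGLE_KEY from .env before the kaggle import

from kaggle.api.kaggle_api_extended import KaggleApi

api = KaggleApi()
api.authenticate()  # raises if no credentials are found in the environment or ~/.kaggle/kaggle.json
print("Kaggle authentication OK")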

Running the Server

  1. Ensure your virtual environment is active.
  2. Run the MCP server:
    uv run kaggle-mcp
    
    The server will start and register its resources, tools, and prompts. You can interact with it using an MCP client or compatible tools.

Running the Docker Container

1. Set up Kaggle API credentials

This project requires Kaggle API credentials to access Kaggle datasets.

  • Go to https://www.kaggle.com/settings and click "Create New API Token" to download your kaggle.json file.
  • Open the kaggle.json file and copy your username and key into a new .env file in the project root:
KAGGLE_USERNAME=your_username
KAGGLE_KEY=your_key

2. Build the Docker image

docker build -t kaggle-mcp-test .

3. Run the Docker container using your .env file

docker run --rm -it --env-file .env kaggle-mcp-test

This will automatically load your Kaggle credentials as environment variables inside the container.


Server Features

The server exposes the following capabilities through the Model Context Protocol:

Tools

  • search_kaggle_datasets(query: str):
    • Searches for datasets on Kaggle matching the provided query string.
    • Returns a JSON list of the top 10 matching datasets with details like reference, title, download count, and last updated date.
  • download_kaggle_dataset(dataset_ref: str, download_path: str | None = None):
    • Downloads and unzips files for a specific Kaggle dataset.
    • dataset_ref: The dataset identifier in the format username/dataset-slug (e.g., kaggle/titanic).
    • download_path (Optional): Specifies where to download the dataset. If omitted, it defaults to ./datasets/<dataset_slug>/ relative to the server script's location.
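
For context, both tools correspond to standard calls in the kaggle package. A rough sketch of what they likely wrap (illustrative only; server.py may structure this differently):

# Rough sketch of the Kaggle calls behind the two tools (illustrative, not the actual server code)
from kaggle.api.kaggle_api_extended import KaggleApi

api = KaggleApi()
api.authenticate()

# search_kaggle_datasets: list datasets matching a query
datasets = api.dataset_list(search="heart disease")
for ds in datasets[:10]:
    print(ds.ref, ds.title)

# download_kaggle_dataset: download and unzip a dataset's files
api.dataset_download_files("user/heart-disease-dataset",
                           path="./datasets/heart-disease-dataset",
                           unzip=True)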

Prompts

  • generate_eda_notebook(dataset_ref: str):
    • Generates a prompt message suitable for an AI model (like Gemini) to create a basic Exploratory Data Analysis (EDA) notebook for the specified Kaggle dataset reference.
    • The prompt asks for Python code covering data loading, missing value checks, visualizations, and basic statistics.
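
For illustration, a prompt like this is typically just a decorated function that returns the instruction text. A sketch with placeholder wording (not the server's exact prompt):

# Illustrative prompt registration; the real generate_eda_notebook wording may differ
from fastmcp import FastMCP

mcp = FastMCP("kaggle-mcp")

@mcp.prompt()
def generate_eda_notebook(dataset_ref: str) -> str:
    """Return an instruction message asking a model to write a basic EDA notebook."""
    return (
        f"Write a Python EDA notebook for the Kaggle dataset '{dataset_ref}'. "
        "Cover data loading, missing-value checks, basic summary statistics, "
        "and a few visualizations."
    )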

Connecting to Claude Desktop

Go to Claude > Settings > Developer > Edit Config and add the following to claude_desktop_config.json:

{
  "mcpServers": {
    "kaggle-mcp": {
      "command": "kaggle-mcp",
      "cwd": "<path-to-their-cloned-repo>/kaggle-mcp"
    }
  }
}
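
This assumes the kaggle-mcp entry point is on the PATH that Claude Desktop uses. If it is not, one alternative is to launch the server through uv (paths are placeholders, and the exact fields supported can vary between Claude Desktop versions):

{
  "mcpServers": {
    "kaggle-mcp": {
      "command": "uv",
      "args": ["run", "kaggle-mcp"],
      "cwd": "<path-to-your-cloned-repo>/kaggle-mcp"
    }
  }
}

Either way, restart Claude Desktop after editing the config so the new server is picked up.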

Usage Example

An AI agent or MCP client could interact with this server as follows:

  1. Agent: "Search Kaggle for datasets about 'heart disease'"
    • Server executes search_kaggle_datasets(query='heart disease')
  2. Agent: "Download the dataset 'user/heart-disease-dataset'"
    • Server executes download_kaggle_dataset(dataset_ref='user/heart-disease-dataset')
  3. Agent: "Generate an EDA notebook prompt for 'user/heart-disease-dataset'"
    • Server executes generate_eda_notebook(dataset_ref='user/heart-disease-dataset')
    • Server returns a structured prompt message.
  4. Agent: (Sends the prompt to a code-generating model) -> Receives EDA Python code.
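
The same flow can also be driven programmatically. A minimal sketch using the fastmcp client, assuming fastmcp 2.x (which can spawn server.py over stdio when given the script path):

# client_sketch.py -- drive the server programmatically (illustrative)
import asyncio
from fastmcp import Client

async def main():
    # Point the client at the server script; fastmcp spawns it over stdio.
    async with Client("server.py") as client:
        result = await client.call_tool(
            "search_kaggle_datasets", {"query": "heart disease"}
        )
        print(result)

asyncio.run(main())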